Feb 20 01:37:22 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Feb 20 01:37:22 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 20 01:37:22 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 20 01:37:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 20 01:37:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 20 01:37:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 20 01:37:22 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 20 01:37:22 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Feb 20 01:37:22 localhost kernel: signal: max sigframe size: 1776
Feb 20 01:37:22 localhost kernel: BIOS-provided physical RAM map:
Feb 20 01:37:22 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 20 01:37:22 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 20 01:37:22 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 20 01:37:22 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 20 01:37:22 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 20 01:37:22 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 20 01:37:22 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 20 01:37:22 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Feb 20 01:37:22 localhost kernel: NX (Execute Disable) protection: active
Feb 20 01:37:22 localhost kernel: SMBIOS 2.8 present.
Feb 20 01:37:22 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 20 01:37:22 localhost kernel: Hypervisor detected: KVM
Feb 20 01:37:22 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 20 01:37:22 localhost kernel: kvm-clock: using sched offset of 2856977710 cycles
Feb 20 01:37:22 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 20 01:37:22 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 20 01:37:22 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Feb 20 01:37:22 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 20 01:37:22 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 20 01:37:22 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 20 01:37:22 localhost kernel: Using GB pages for direct mapping
Feb 20 01:37:22 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Feb 20 01:37:22 localhost kernel: ACPI: Early table checksum verification disabled
Feb 20 01:37:22 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 20 01:37:22 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 20 01:37:22 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 20 01:37:22 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 20 01:37:22 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 20 01:37:22 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 20 01:37:22 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 20 01:37:22 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 20 01:37:22 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 20 01:37:22 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 20 01:37:22 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 20 01:37:22 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 20 01:37:22 localhost kernel: No NUMA configuration found
Feb 20 01:37:22 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Feb 20 01:37:22 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Feb 20 01:37:22 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Feb 20 01:37:22 localhost kernel: Zone ranges:
Feb 20 01:37:22 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 20 01:37:22 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 20 01:37:22 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Feb 20 01:37:22 localhost kernel:   Device   empty
Feb 20 01:37:22 localhost kernel: Movable zone start for each node
Feb 20 01:37:22 localhost kernel: Early memory node ranges
Feb 20 01:37:22 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 20 01:37:22 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 20 01:37:22 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Feb 20 01:37:22 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Feb 20 01:37:22 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 20 01:37:22 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 20 01:37:22 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 20 01:37:22 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 20 01:37:22 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 20 01:37:22 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 20 01:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 20 01:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 20 01:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 20 01:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 20 01:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 20 01:37:22 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 20 01:37:22 localhost kernel: TSC deadline timer available
Feb 20 01:37:22 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 20 01:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 20 01:37:22 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 20 01:37:22 localhost kernel: Booting paravirtualized kernel on KVM
Feb 20 01:37:22 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 20 01:37:22 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 20 01:37:22 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Feb 20 01:37:22 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 20 01:37:22 localhost kernel: Fallback order for Node 0: 0
Feb 20 01:37:22 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Feb 20 01:37:22 localhost kernel: Policy zone: Normal
Feb 20 01:37:22 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 20 01:37:22 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Feb 20 01:37:22 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 20 01:37:22 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 20 01:37:22 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 20 01:37:22 localhost kernel: software IO TLB: area num 8.
Feb 20 01:37:22 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Feb 20 01:37:22 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Feb 20 01:37:22 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 20 01:37:22 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Feb 20 01:37:22 localhost kernel: ftrace: allocated 176 pages with 3 groups
Feb 20 01:37:22 localhost kernel: Dynamic Preempt: voluntary
Feb 20 01:37:22 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 20 01:37:22 localhost kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 20 01:37:22 localhost kernel: 	Trampoline variant of Tasks RCU enabled.
Feb 20 01:37:22 localhost kernel: 	Rude variant of Tasks RCU enabled.
Feb 20 01:37:22 localhost kernel: 	Tracing variant of Tasks RCU enabled.
Feb 20 01:37:22 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 20 01:37:22 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 20 01:37:22 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 20 01:37:22 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 20 01:37:22 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 20 01:37:22 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Feb 20 01:37:22 localhost kernel: Console: colour VGA+ 80x25
Feb 20 01:37:22 localhost kernel: printk: console [tty0] enabled
Feb 20 01:37:22 localhost kernel: printk: console [ttyS0] enabled
Feb 20 01:37:22 localhost kernel: ACPI: Core revision 20211217
Feb 20 01:37:22 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 20 01:37:22 localhost kernel: x2apic enabled
Feb 20 01:37:22 localhost kernel: Switched APIC routing to physical x2apic.
Feb 20 01:37:22 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 20 01:37:22 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 20 01:37:22 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 20 01:37:22 localhost kernel: LSM: Security Framework initializing
Feb 20 01:37:22 localhost kernel: Yama: becoming mindful.
Feb 20 01:37:22 localhost kernel: SELinux: Initializing.
Feb 20 01:37:22 localhost kernel: LSM support for eBPF active
Feb 20 01:37:22 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 20 01:37:22 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 20 01:37:22 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 20 01:37:22 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 20 01:37:22 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 20 01:37:22 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 20 01:37:22 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 20 01:37:22 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 20 01:37:22 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Feb 20 01:37:22 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 20 01:37:22 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 20 01:37:22 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 20 01:37:22 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 20 01:37:22 localhost kernel: Freeing SMP alternatives memory: 36K
Feb 20 01:37:22 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 20 01:37:22 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Feb 20 01:37:22 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 20 01:37:22 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 20 01:37:22 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 20 01:37:22 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 20 01:37:22 localhost kernel: ... version:                0
Feb 20 01:37:22 localhost kernel: ... bit width:              48
Feb 20 01:37:22 localhost kernel: ... generic registers:      6
Feb 20 01:37:22 localhost kernel: ... value mask:             0000ffffffffffff
Feb 20 01:37:22 localhost kernel: ... max period:             00007fffffffffff
Feb 20 01:37:22 localhost kernel: ... fixed-purpose events:   0
Feb 20 01:37:22 localhost kernel: ... event mask:             000000000000003f
Feb 20 01:37:22 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 20 01:37:22 localhost kernel: rcu: 	Max phase no-delay instances is 400.
Feb 20 01:37:22 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 20 01:37:22 localhost kernel: x86: Booting SMP configuration:
Feb 20 01:37:22 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Feb 20 01:37:22 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 20 01:37:22 localhost kernel: smpboot: Max logical packages: 8
Feb 20 01:37:22 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 20 01:37:22 localhost kernel: node 0 deferred pages initialised in 23ms
Feb 20 01:37:22 localhost kernel: devtmpfs: initialized
Feb 20 01:37:22 localhost kernel: x86/mm: Memory block size: 128MB
Feb 20 01:37:22 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 20 01:37:22 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Feb 20 01:37:22 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 20 01:37:22 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 20 01:37:22 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Feb 20 01:37:22 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 20 01:37:22 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 20 01:37:22 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 20 01:37:22 localhost kernel: audit: type=2000 audit(1771569440.456:1): state=initialized audit_enabled=0 res=1
Feb 20 01:37:22 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 20 01:37:22 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 20 01:37:22 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 20 01:37:22 localhost kernel: cpuidle: using governor menu
Feb 20 01:37:22 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Feb 20 01:37:22 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 20 01:37:22 localhost kernel: PCI: Using configuration type 1 for base access
Feb 20 01:37:22 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 20 01:37:22 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 20 01:37:22 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Feb 20 01:37:22 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 20 01:37:22 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 20 01:37:22 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 20 01:37:22 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 20 01:37:22 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 20 01:37:22 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 20 01:37:22 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 20 01:37:22 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 20 01:37:22 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 20 01:37:22 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 20 01:37:22 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 20 01:37:22 localhost kernel: ACPI: Interpreter enabled
Feb 20 01:37:22 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 20 01:37:22 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 20 01:37:22 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 20 01:37:22 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 20 01:37:22 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 20 01:37:22 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 20 01:37:22 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [3] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [4] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [5] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [6] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [7] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [8] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [9] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [10] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [11] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [12] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [13] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [14] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [15] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [16] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [17] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [18] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [19] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [20] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [21] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [22] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [23] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [24] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [25] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [26] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [27] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [28] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [29] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [30] registered
Feb 20 01:37:22 localhost kernel: acpiphp: Slot [31] registered
Feb 20 01:37:22 localhost kernel: PCI host bridge to bus 0000:00
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 20 01:37:22 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 20 01:37:22 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Feb 20 01:37:22 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Feb 20 01:37:22 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Feb 20 01:37:22 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 20 01:37:22 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Feb 20 01:37:22 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Feb 20 01:37:22 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Feb 20 01:37:22 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Feb 20 01:37:22 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 20 01:37:22 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Feb 20 01:37:22 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Feb 20 01:37:22 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 20 01:37:22 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Feb 20 01:37:22 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Feb 20 01:37:22 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 20 01:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 20 01:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 20 01:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 20 01:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 20 01:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 20 01:37:22 localhost kernel: iommu: Default domain type: Translated
Feb 20 01:37:22 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 20 01:37:22 localhost kernel: SCSI subsystem initialized
Feb 20 01:37:22 localhost kernel: ACPI: bus type USB registered
Feb 20 01:37:22 localhost kernel: usbcore: registered new interface driver usbfs
Feb 20 01:37:22 localhost kernel: usbcore: registered new interface driver hub
Feb 20 01:37:22 localhost kernel: usbcore: registered new device driver usb
Feb 20 01:37:22 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 20 01:37:22 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Feb 20 01:37:22 localhost kernel: PTP clock support registered
Feb 20 01:37:22 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 20 01:37:22 localhost kernel: NetLabel: Initializing
Feb 20 01:37:22 localhost kernel: NetLabel: domain hash size = 128
Feb 20 01:37:22 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Feb 20 01:37:22 localhost kernel: NetLabel: unlabeled traffic allowed by default
Feb 20 01:37:22 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 20 01:37:22 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 20 01:37:22 localhost kernel: vgaarb: loaded
Feb 20 01:37:22 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 20 01:37:22 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 20 01:37:22 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 20 01:37:22 localhost kernel: pnp: PnP ACPI init
Feb 20 01:37:22 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 20 01:37:22 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 20 01:37:22 localhost kernel: NET: Registered PF_INET protocol family
Feb 20 01:37:22 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 20 01:37:22 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Feb 20 01:37:22 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 20 01:37:22 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 20 01:37:22 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 20 01:37:22 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Feb 20 01:37:22 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Feb 20 01:37:22 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 20 01:37:22 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 20 01:37:22 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 20 01:37:22 localhost kernel: NET: Registered PF_XDP protocol family
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 20 01:37:22 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 20 01:37:22 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 20 01:37:22 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 20 01:37:22 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 27140 usecs
Feb 20 01:37:22 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 20 01:37:22 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 20 01:37:22 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 20 01:37:22 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 20 01:37:22 localhost kernel: ACPI: bus type thunderbolt registered
Feb 20 01:37:22 localhost kernel: Initialise system trusted keyrings
Feb 20 01:37:22 localhost kernel: Key type blacklist registered
Feb 20 01:37:22 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Feb 20 01:37:22 localhost kernel: zbud: loaded
Feb 20 01:37:22 localhost kernel: integrity: Platform Keyring initialized
Feb 20 01:37:22 localhost kernel: NET: Registered PF_ALG protocol family
Feb 20 01:37:22 localhost kernel: xor: automatically using best checksumming function avx
Feb 20 01:37:22 localhost kernel: Key type asymmetric registered
Feb 20 01:37:22 localhost kernel: Asymmetric key parser 'x509' registered
Feb 20 01:37:22 localhost kernel: Running certificate verification selftests
Feb 20 01:37:22 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 20 01:37:22 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 20 01:37:22 localhost kernel: io scheduler mq-deadline registered
Feb 20 01:37:22 localhost kernel: io scheduler kyber registered
Feb 20 01:37:22 localhost kernel: io scheduler bfq registered
Feb 20 01:37:22 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 20 01:37:22 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 20 01:37:22 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 20 01:37:22 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 20 01:37:22 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 20 01:37:22 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 20 01:37:22 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 20 01:37:22 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 20 01:37:22 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 20 01:37:22 localhost kernel: Non-volatile memory driver v1.3
Feb 20 01:37:22 localhost kernel: rdac: device handler registered
Feb 20 01:37:22 localhost kernel: hp_sw: device handler registered
Feb 20 01:37:22 localhost kernel: emc: device handler registered
Feb 20 01:37:22 localhost kernel: alua: device handler registered
Feb 20 01:37:22 localhost kernel: libphy: Fixed MDIO Bus: probed
Feb 20 01:37:22 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Feb 20 01:37:22 localhost kernel: ehci-pci: EHCI PCI platform driver
Feb 20 01:37:22 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Feb 20 01:37:22 localhost kernel: ohci-pci: OHCI PCI platform driver
Feb 20 01:37:22 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Feb 20 01:37:22 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 20 01:37:22 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 20 01:37:22 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 20 01:37:22 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 20 01:37:22 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 20 01:37:22 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 20 01:37:22 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 20 01:37:22 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Feb 20 01:37:22 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 20 01:37:22 localhost kernel: hub 1-0:1.0: USB hub found
Feb 20 01:37:22 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 20 01:37:22 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 20 01:37:22 localhost kernel: usbserial: USB Serial support registered for generic
Feb 20 01:37:22 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 20 01:37:22 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 20 01:37:22 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 20 01:37:22 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 20 01:37:22 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 20 01:37:22 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 20 01:37:22 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 20 01:37:22 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-20T06:37:21 UTC (1771569441)
Feb 20 01:37:22 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 20 01:37:22 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 20 01:37:22 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 20 01:37:22 localhost kernel: usbcore: registered new interface driver usbhid
Feb 20 01:37:22 localhost kernel: usbhid: USB HID core driver
Feb 20 01:37:22 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 20 01:37:22 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 20 01:37:22 localhost kernel: Initializing XFRM netlink socket
Feb 20 01:37:22 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 20 01:37:22 localhost kernel: Segment Routing with IPv6
Feb 20 01:37:22 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 20 01:37:22 localhost kernel: mpls_gso: MPLS GSO support
Feb 20 01:37:22 localhost kernel: IPI shorthand broadcast: enabled
Feb 20 01:37:22 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 20 01:37:22 localhost kernel: AES CTR mode by8 optimization enabled
Feb 20 01:37:22 localhost kernel: sched_clock: Marking stable (775586023, 178347764)->(1078203314, -124269527)
Feb 20 01:37:22 localhost kernel: registered taskstats version 1
Feb 20 01:37:22 localhost kernel: Loading compiled-in X.509 certificates
Feb 20 01:37:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 20 01:37:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 20 01:37:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 20 01:37:22 localhost kernel: zswap: loaded using pool lzo/zbud
Feb 20 01:37:22 localhost kernel: page_owner is disabled
Feb 20 01:37:22 localhost kernel: Key type big_key registered
Feb 20 01:37:22 localhost kernel: Freeing initrd memory: 74232K
Feb 20 01:37:22 localhost kernel: Key type encrypted registered
Feb 20 01:37:22 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 20 01:37:22 localhost kernel: Loading compiled-in module X.509 certificates
Feb 20 01:37:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 20 01:37:22 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 20 01:37:22 localhost kernel: ima: No architecture policies found
Feb 20 01:37:22 localhost kernel: evm: Initialising EVM extended attributes:
Feb 20 01:37:22 localhost kernel: evm: security.selinux
Feb 20 01:37:22 localhost kernel: evm: security.SMACK64 (disabled)
Feb 20 01:37:22 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 20 01:37:22 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 20 01:37:22 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 20 01:37:22 localhost kernel: evm: security.apparmor (disabled)
Feb 20 01:37:22 localhost kernel: evm: security.ima
Feb 20 01:37:22 localhost kernel: evm: security.capability
Feb 20 01:37:22 localhost kernel: evm: HMAC attrs: 0x1
Feb 20 01:37:22 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 20 01:37:22 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 20 01:37:22 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 20 01:37:22 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 20 01:37:22 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 20 01:37:22 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 20 01:37:22 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 20 01:37:22 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 20 01:37:22 localhost kernel: Freeing unused decrypted memory: 2036K
Feb 20 01:37:22 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Feb 20 01:37:22 localhost kernel: Write protecting the kernel read-only data: 26624k
Feb 20 01:37:22 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Feb 20 01:37:22 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Feb 20 01:37:22 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 20 01:37:22 localhost kernel: Run /init as init process
Feb 20 01:37:22 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 20 01:37:22 localhost systemd[1]: Detected virtualization kvm.
Feb 20 01:37:22 localhost systemd[1]: Detected architecture x86-64.
Feb 20 01:37:22 localhost systemd[1]: Running in initrd.
Feb 20 01:37:22 localhost systemd[1]: No hostname configured, using default hostname.
Feb 20 01:37:22 localhost systemd[1]: Hostname set to .
Feb 20 01:37:22 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 20 01:37:22 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 20 01:37:22 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 20 01:37:22 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 20 01:37:22 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 20 01:37:22 localhost systemd[1]: Reached target Local File Systems.
Feb 20 01:37:22 localhost systemd[1]: Reached target Path Units.
Feb 20 01:37:22 localhost systemd[1]: Reached target Slice Units.
Feb 20 01:37:22 localhost systemd[1]: Reached target Swaps.
Feb 20 01:37:22 localhost systemd[1]: Reached target Timer Units.
Feb 20 01:37:22 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 20 01:37:22 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 20 01:37:22 localhost systemd[1]: Listening on Journal Socket.
Feb 20 01:37:22 localhost systemd[1]: Listening on udev Control Socket.
Feb 20 01:37:22 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 20 01:37:22 localhost systemd[1]: Reached target Socket Units.
Feb 20 01:37:22 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 20 01:37:22 localhost systemd[1]: Starting Journal Service...
Feb 20 01:37:22 localhost systemd[1]: Starting Load Kernel Modules...
Feb 20 01:37:22 localhost systemd[1]: Starting Create System Users...
Feb 20 01:37:22 localhost systemd[1]: Starting Setup Virtual Console...
Feb 20 01:37:22 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 20 01:37:22 localhost systemd[1]: Finished Load Kernel Modules.
Feb 20 01:37:22 localhost systemd-journald[283]: Journal started
Feb 20 01:37:22 localhost systemd-journald[283]: Runtime Journal (/run/log/journal/f44a30b3674b4e65a07dfb3d71d4ae11) is 8.0M, max 314.7M, 306.7M free.
Feb 20 01:37:22 localhost systemd-modules-load[284]: Module 'msr' is built in
Feb 20 01:37:22 localhost systemd[1]: Started Journal Service.
Feb 20 01:37:22 localhost systemd[1]: Finished Setup Virtual Console.
Feb 20 01:37:22 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 20 01:37:22 localhost systemd[1]: Starting dracut cmdline hook...
Feb 20 01:37:22 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 20 01:37:22 localhost systemd-sysusers[285]: Creating group 'sgx' with GID 997.
Feb 20 01:37:22 localhost systemd-sysusers[285]: Creating group 'users' with GID 100.
Feb 20 01:37:22 localhost systemd-sysusers[285]: Creating group 'dbus' with GID 81.
Feb 20 01:37:22 localhost systemd-sysusers[285]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 20 01:37:22 localhost systemd[1]: Finished Create System Users.
Feb 20 01:37:22 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 20 01:37:22 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 20 01:37:22 localhost dracut-cmdline[288]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Feb 20 01:37:22 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 20 01:37:22 localhost dracut-cmdline[288]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 20 01:37:22 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 20 01:37:22 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 20 01:37:22 localhost systemd[1]: Finished dracut cmdline hook.
Feb 20 01:37:22 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 20 01:37:22 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 20 01:37:22 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 20 01:37:22 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Feb 20 01:37:22 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 20 01:37:22 localhost kernel: RPC: Registered udp transport module.
Feb 20 01:37:22 localhost kernel: RPC: Registered tcp transport module.
Feb 20 01:37:22 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 20 01:37:22 localhost rpc.statd[408]: Version 2.5.4 starting
Feb 20 01:37:22 localhost rpc.statd[408]: Initializing NSM state
Feb 20 01:37:22 localhost rpc.idmapd[413]: Setting log level to 0
Feb 20 01:37:22 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 20 01:37:22 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 20 01:37:22 localhost systemd-udevd[426]: Using default interface naming scheme 'rhel-9.0'.
Feb 20 01:37:22 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 20 01:37:22 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 20 01:37:22 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 20 01:37:22 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 20 01:37:22 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 20 01:37:22 localhost systemd[1]: Reached target System Initialization.
Feb 20 01:37:22 localhost systemd[1]: Reached target Basic System.
Feb 20 01:37:22 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 20 01:37:22 localhost systemd[1]: Reached target Network.
Feb 20 01:37:22 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 20 01:37:22 localhost systemd[1]: Starting dracut initqueue hook...
Feb 20 01:37:22 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Feb 20 01:37:22 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Feb 20 01:37:22 localhost kernel: GPT:20971519 != 838860799
Feb 20 01:37:22 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Feb 20 01:37:22 localhost kernel: GPT:20971519 != 838860799
Feb 20 01:37:22 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Feb 20 01:37:22 localhost kernel: vda: vda1 vda2 vda3 vda4
Feb 20 01:37:22 localhost systemd-udevd[453]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 01:37:22 localhost kernel: scsi host0: ata_piix
Feb 20 01:37:22 localhost kernel: scsi host1: ata_piix
Feb 20 01:37:22 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Feb 20 01:37:22 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Feb 20 01:37:22 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 20 01:37:22 localhost systemd[1]: Reached target Initrd Root Device.
Feb 20 01:37:22 localhost kernel: ata1: found unknown device (class 0)
Feb 20 01:37:22 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 20 01:37:22 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Feb 20 01:37:23 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 20 01:37:23 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 20 01:37:23 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 20 01:37:23 localhost systemd[1]: Finished dracut initqueue hook.
Feb 20 01:37:23 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 20 01:37:23 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 20 01:37:23 localhost systemd[1]: Reached target Remote File Systems.
Feb 20 01:37:23 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 20 01:37:23 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 20 01:37:23 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Feb 20 01:37:23 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Feb 20 01:37:23 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 20 01:37:23 localhost systemd[1]: Mounting /sysroot...
Feb 20 01:37:23 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 20 01:37:23 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Feb 20 01:37:23 localhost kernel: XFS (vda4): Ending clean mount
Feb 20 01:37:23 localhost systemd[1]: Mounted /sysroot.
Feb 20 01:37:23 localhost systemd[1]: Reached target Initrd Root File System.
Feb 20 01:37:23 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 20 01:37:23 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 20 01:37:23 localhost systemd[1]: Reached target Initrd File Systems.
Feb 20 01:37:23 localhost systemd[1]: Reached target Initrd Default Target.
Feb 20 01:37:23 localhost systemd[1]: Starting dracut mount hook...
Feb 20 01:37:23 localhost systemd[1]: Finished dracut mount hook.
Feb 20 01:37:23 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 20 01:37:23 localhost rpc.idmapd[413]: exiting on signal 15
Feb 20 01:37:23 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 20 01:37:23 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 20 01:37:23 localhost systemd[1]: Stopped target Network.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Timer Units.
Feb 20 01:37:23 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 20 01:37:23 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Basic System.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Path Units.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Remote File Systems.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Slice Units.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Socket Units.
Feb 20 01:37:23 localhost systemd[1]: Stopped target System Initialization.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Local File Systems.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Swaps.
Feb 20 01:37:23 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped dracut mount hook.
Feb 20 01:37:23 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 20 01:37:23 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 20 01:37:23 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 20 01:37:23 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 20 01:37:23 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 20 01:37:23 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 20 01:37:23 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 20 01:37:23 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 20 01:37:23 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 20 01:37:23 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 20 01:37:23 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 20 01:37:23 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 20 01:37:23 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 20 01:37:23 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Closed udev Control Socket.
Feb 20 01:37:23 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Closed udev Kernel Socket.
Feb 20 01:37:23 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 20 01:37:23 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 20 01:37:23 localhost systemd[1]: Starting Cleanup udev Database...
Feb 20 01:37:23 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 20 01:37:23 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 20 01:37:23 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Stopped Create System Users.
Feb 20 01:37:23 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 20 01:37:23 localhost systemd[1]: Finished Cleanup udev Database.
Feb 20 01:37:23 localhost systemd[1]: Reached target Switch Root.
Feb 20 01:37:23 localhost systemd[1]: Starting Switch Root...
Feb 20 01:37:23 localhost systemd[1]: Switching root.
Feb 20 01:37:23 localhost systemd-journald[283]: Journal stopped
Feb 20 01:37:24 localhost systemd-journald[283]: Received SIGTERM from PID 1 (systemd).
Feb 20 01:37:24 localhost kernel: audit: type=1404 audit(1771569444.052:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 20 01:37:24 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 01:37:24 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 01:37:24 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 01:37:24 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 01:37:24 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 01:37:24 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 01:37:24 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 01:37:24 localhost kernel: audit: type=1403 audit(1771569444.185:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 20 01:37:24 localhost systemd[1]: Successfully loaded SELinux policy in 136.805ms.
Feb 20 01:37:24 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 35.458ms.
Feb 20 01:37:24 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 20 01:37:24 localhost systemd[1]: Detected virtualization kvm.
Feb 20 01:37:24 localhost systemd[1]: Detected architecture x86-64.
Feb 20 01:37:24 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 01:37:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 01:37:24 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 20 01:37:24 localhost systemd[1]: Stopped Switch Root.
Feb 20 01:37:24 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 20 01:37:24 localhost systemd[1]: Created slice Slice /system/getty.
Feb 20 01:37:24 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 20 01:37:24 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 20 01:37:24 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 20 01:37:24 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Feb 20 01:37:24 localhost systemd[1]: Created slice User and Session Slice.
Feb 20 01:37:24 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 20 01:37:24 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 20 01:37:24 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 20 01:37:24 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 20 01:37:24 localhost systemd[1]: Stopped target Switch Root.
Feb 20 01:37:24 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 20 01:37:24 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 20 01:37:24 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 20 01:37:24 localhost systemd[1]: Reached target Path Units.
Feb 20 01:37:24 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 20 01:37:24 localhost systemd[1]: Reached target Slice Units.
Feb 20 01:37:24 localhost systemd[1]: Reached target Swaps.
Feb 20 01:37:24 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 20 01:37:24 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 20 01:37:24 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 20 01:37:24 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 20 01:37:24 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 20 01:37:24 localhost systemd[1]: Listening on udev Control Socket.
Feb 20 01:37:24 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 20 01:37:24 localhost systemd[1]: Mounting Huge Pages File System...
Feb 20 01:37:24 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 20 01:37:24 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 20 01:37:24 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 20 01:37:24 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 20 01:37:24 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 20 01:37:24 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 20 01:37:24 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 20 01:37:24 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 20 01:37:24 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 20 01:37:24 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 20 01:37:24 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 20 01:37:24 localhost systemd[1]: Stopped Journal Service.
Feb 20 01:37:24 localhost systemd[1]: Starting Journal Service...
Feb 20 01:37:24 localhost systemd[1]: Starting Load Kernel Modules...
Feb 20 01:37:24 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 20 01:37:24 localhost kernel: ACPI: bus type drm_connector registered
Feb 20 01:37:24 localhost kernel: fuse: init (API version 7.36)
Feb 20 01:37:24 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 20 01:37:24 localhost systemd-journald[618]: Journal started
Feb 20 01:37:24 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/01f46965e72fd8a157841feaa66c8d52) is 8.0M, max 314.7M, 306.7M free.
Feb 20 01:37:24 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 20 01:37:24 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 20 01:37:24 localhost systemd-modules-load[619]: Module 'msr' is built in
Feb 20 01:37:24 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 20 01:37:24 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 20 01:37:24 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 20 01:37:24 localhost systemd[1]: Started Journal Service.
Feb 20 01:37:24 localhost systemd[1]: Mounted Huge Pages File System.
Feb 20 01:37:24 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 20 01:37:24 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 20 01:37:24 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 20 01:37:24 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 20 01:37:24 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 20 01:37:24 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 20 01:37:24 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 20 01:37:24 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 20 01:37:24 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 20 01:37:24 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 20 01:37:24 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 20 01:37:24 localhost systemd[1]: Finished Load Kernel Modules.
Feb 20 01:37:24 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 20 01:37:24 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 20 01:37:24 localhost systemd[1]: Mounting FUSE Control File System...
Feb 20 01:37:24 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 20 01:37:24 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 20 01:37:24 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 20 01:37:24 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 20 01:37:24 localhost systemd[1]: Starting Load/Save Random Seed...
Feb 20 01:37:24 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 20 01:37:24 localhost systemd[1]: Starting Create System Users...
Feb 20 01:37:24 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/01f46965e72fd8a157841feaa66c8d52) is 8.0M, max 314.7M, 306.7M free.
Feb 20 01:37:24 localhost systemd-journald[618]: Received client request to flush runtime journal.
Feb 20 01:37:24 localhost systemd[1]: Mounted FUSE Control File System.
Feb 20 01:37:24 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 20 01:37:24 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 20 01:37:24 localhost systemd[1]: Finished Load/Save Random Seed.
Feb 20 01:37:24 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 20 01:37:24 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 20 01:37:24 localhost systemd-sysusers[630]: Creating group 'sgx' with GID 989.
Feb 20 01:37:24 localhost systemd-sysusers[630]: Creating group 'systemd-oom' with GID 988.
Feb 20 01:37:24 localhost systemd-sysusers[630]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Feb 20 01:37:24 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 20 01:37:24 localhost systemd[1]: Finished Create System Users.
Feb 20 01:37:24 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 20 01:37:25 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 20 01:37:25 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 20 01:37:25 localhost systemd[1]: Set up automount EFI System Partition Automount.
Feb 20 01:37:25 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 20 01:37:25 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 20 01:37:25 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Feb 20 01:37:25 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 20 01:37:25 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 20 01:37:25 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 20 01:37:25 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 20 01:37:25 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 20 01:37:25 localhost systemd-udevd[648]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 01:37:25 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6 Feb 20 01:37:25 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 Feb 20 01:37:25 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped. Feb 20 01:37:25 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7... Feb 20 01:37:25 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped. Feb 20 01:37:25 localhost systemd[1]: Mounting /boot... Feb 20 01:37:25 localhost kernel: XFS (vda3): Mounting V5 Filesystem Feb 20 01:37:25 localhost systemd-fsck[687]: fsck.fat 4.2 (2021-01-31) Feb 20 01:37:25 localhost systemd-fsck[687]: /dev/vda2: 12 files, 1782/51145 clusters Feb 20 01:37:25 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7. Feb 20 01:37:25 localhost kernel: XFS (vda3): Ending clean mount Feb 20 01:37:25 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff) Feb 20 01:37:25 localhost systemd[1]: Mounted /boot. 
Feb 20 01:37:25 localhost kernel: SVM: TSC scaling supported Feb 20 01:37:25 localhost kernel: kvm: Nested Virtualization enabled Feb 20 01:37:25 localhost kernel: SVM: kvm: Nested Paging enabled Feb 20 01:37:25 localhost kernel: SVM: LBR virtualization supported Feb 20 01:37:25 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 Feb 20 01:37:25 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console Feb 20 01:37:25 localhost kernel: Console: switching to colour dummy device 80x25 Feb 20 01:37:25 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible Feb 20 01:37:25 localhost kernel: [drm] features: -context_init Feb 20 01:37:25 localhost kernel: [drm] number of scanouts: 1 Feb 20 01:37:25 localhost kernel: [drm] number of cap sets: 0 Feb 20 01:37:25 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0 Feb 20 01:37:25 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called Feb 20 01:37:25 localhost kernel: Console: switching to colour frame buffer device 128x48 Feb 20 01:37:25 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device Feb 20 01:37:25 localhost systemd[1]: Mounting /boot/efi... Feb 20 01:37:25 localhost systemd[1]: Mounted /boot/efi. Feb 20 01:37:25 localhost systemd[1]: Reached target Local File Systems. Feb 20 01:37:25 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache... Feb 20 01:37:25 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux). Feb 20 01:37:25 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 20 01:37:25 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). 
Feb 20 01:37:25 localhost systemd[1]: Starting Automatic Boot Loader Update... Feb 20 01:37:25 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id). Feb 20 01:37:25 localhost systemd[1]: Starting Create Volatile Files and Directories... Feb 20 01:37:25 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 713 (bootctl) Feb 20 01:37:25 localhost systemd[1]: Starting File System Check on /dev/vda2... Feb 20 01:37:25 localhost systemd[1]: Finished File System Check on /dev/vda2. Feb 20 01:37:26 localhost systemd[1]: Mounting EFI System Partition Automount... Feb 20 01:37:26 localhost systemd[1]: Mounted EFI System Partition Automount. Feb 20 01:37:26 localhost systemd[1]: Finished Automatic Boot Loader Update. Feb 20 01:37:26 localhost systemd[1]: Finished Create Volatile Files and Directories. Feb 20 01:37:26 localhost systemd[1]: Starting Security Auditing Service... Feb 20 01:37:26 localhost systemd[1]: Starting RPC Bind... Feb 20 01:37:26 localhost systemd[1]: Starting Rebuild Journal Catalog... Feb 20 01:37:26 localhost systemd[1]: Finished Rebuild Journal Catalog. Feb 20 01:37:26 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins Feb 20 01:37:26 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable) Feb 20 01:37:26 localhost systemd[1]: Started RPC Bind. 
Feb 20 01:37:26 localhost augenrules[730]: /sbin/augenrules: No change Feb 20 01:37:26 localhost augenrules[740]: No rules Feb 20 01:37:26 localhost augenrules[740]: enabled 1 Feb 20 01:37:26 localhost augenrules[740]: failure 1 Feb 20 01:37:26 localhost augenrules[740]: pid 725 Feb 20 01:37:26 localhost augenrules[740]: rate_limit 0 Feb 20 01:37:26 localhost augenrules[740]: backlog_limit 8192 Feb 20 01:37:26 localhost augenrules[740]: lost 0 Feb 20 01:37:26 localhost augenrules[740]: backlog 3 Feb 20 01:37:26 localhost augenrules[740]: backlog_wait_time 60000 Feb 20 01:37:26 localhost augenrules[740]: backlog_wait_time_actual 0 Feb 20 01:37:26 localhost augenrules[740]: enabled 1 Feb 20 01:37:26 localhost augenrules[740]: failure 1 Feb 20 01:37:26 localhost augenrules[740]: pid 725 Feb 20 01:37:26 localhost augenrules[740]: rate_limit 0 Feb 20 01:37:26 localhost augenrules[740]: backlog_limit 8192 Feb 20 01:37:26 localhost augenrules[740]: lost 0 Feb 20 01:37:26 localhost augenrules[740]: backlog 0 Feb 20 01:37:26 localhost augenrules[740]: backlog_wait_time 60000 Feb 20 01:37:26 localhost augenrules[740]: backlog_wait_time_actual 0 Feb 20 01:37:26 localhost augenrules[740]: enabled 1 Feb 20 01:37:26 localhost augenrules[740]: failure 1 Feb 20 01:37:26 localhost augenrules[740]: pid 725 Feb 20 01:37:26 localhost augenrules[740]: rate_limit 0 Feb 20 01:37:26 localhost augenrules[740]: backlog_limit 8192 Feb 20 01:37:26 localhost augenrules[740]: lost 0 Feb 20 01:37:26 localhost augenrules[740]: backlog 4 Feb 20 01:37:26 localhost augenrules[740]: backlog_wait_time 60000 Feb 20 01:37:26 localhost augenrules[740]: backlog_wait_time_actual 0 Feb 20 01:37:26 localhost systemd[1]: Started Security Auditing Service. Feb 20 01:37:26 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP... Feb 20 01:37:26 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP. Feb 20 01:37:26 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache. 
Feb 20 01:37:26 localhost systemd[1]: Starting Update is Completed... Feb 20 01:37:26 localhost systemd[1]: Finished Update is Completed. Feb 20 01:37:26 localhost systemd[1]: Reached target System Initialization. Feb 20 01:37:26 localhost systemd[1]: Started dnf makecache --timer. Feb 20 01:37:26 localhost systemd[1]: Started Daily rotation of log files. Feb 20 01:37:26 localhost systemd[1]: Started Daily Cleanup of Temporary Directories. Feb 20 01:37:26 localhost systemd[1]: Reached target Timer Units. Feb 20 01:37:26 localhost systemd[1]: Listening on D-Bus System Message Bus Socket. Feb 20 01:37:26 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket. Feb 20 01:37:26 localhost systemd[1]: Reached target Socket Units. Feb 20 01:37:26 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)... Feb 20 01:37:26 localhost systemd[1]: Starting D-Bus System Message Bus... Feb 20 01:37:26 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 20 01:37:26 localhost systemd[1]: Started D-Bus System Message Bus. Feb 20 01:37:26 localhost systemd[1]: Reached target Basic System. Feb 20 01:37:26 localhost systemd[1]: Starting NTP client/server... Feb 20 01:37:26 localhost systemd[1]: Starting Restore /run/initramfs on shutdown... Feb 20 01:37:26 localhost journal[750]: Ready Feb 20 01:37:26 localhost systemd[1]: Started irqbalance daemon. Feb 20 01:37:26 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload). Feb 20 01:37:26 localhost systemd[1]: Starting System Logging Service... 
Feb 20 01:37:26 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 20 01:37:26 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 20 01:37:26 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 20 01:37:26 localhost systemd[1]: Reached target sshd-keygen.target. Feb 20 01:37:26 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met. Feb 20 01:37:26 localhost systemd[1]: Reached target User and Group Name Lookups. Feb 20 01:37:26 localhost systemd[1]: Starting User Login Management... Feb 20 01:37:26 localhost rsyslogd[758]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="758" x-info="https://www.rsyslog.com"] start Feb 20 01:37:26 localhost rsyslogd[758]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ] Feb 20 01:37:26 localhost systemd[1]: Started System Logging Service. Feb 20 01:37:26 localhost systemd[1]: Finished Restore /run/initramfs on shutdown. Feb 20 01:37:26 localhost chronyd[765]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Feb 20 01:37:26 localhost chronyd[765]: Using right/UTC timezone to obtain leap second data Feb 20 01:37:26 localhost chronyd[765]: Loaded seccomp filter (level 2) Feb 20 01:37:26 localhost systemd[1]: Started NTP client/server. Feb 20 01:37:26 localhost systemd-logind[759]: New seat seat0. 
Feb 20 01:37:26 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button) Feb 20 01:37:26 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Feb 20 01:37:26 localhost systemd[1]: Started User Login Management. Feb 20 01:37:26 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 01:37:26 localhost cloud-init[769]: Cloud-init v. 22.1-9.el9 running 'init-local' at Fri, 20 Feb 2026 06:37:26 +0000. Up 6.12 seconds. Feb 20 01:37:27 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpjpj1ezo2.mount: Deactivated successfully. Feb 20 01:37:27 localhost systemd[1]: Starting Hostname Service... Feb 20 01:37:27 localhost systemd[1]: Started Hostname Service. Feb 20 01:37:27 localhost systemd-hostnamed[783]: Hostname set to (static) Feb 20 01:37:27 localhost systemd[1]: Finished Initial cloud-init job (pre-networking). Feb 20 01:37:27 localhost systemd[1]: Reached target Preparation for Network. Feb 20 01:37:27 localhost systemd[1]: Starting Network Manager... Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4523] NetworkManager (version 1.42.2-1.el9) is starting... (boot:de550921-08e7-4d93-b7d2-b745d62af5c6) Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4530] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf) Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4589] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager" Feb 20 01:37:27 localhost systemd[1]: Started Network Manager. Feb 20 01:37:27 localhost systemd[1]: Reached target Network. Feb 20 01:37:27 localhost systemd[1]: Starting Network Manager Wait Online... Feb 20 01:37:27 localhost systemd[1]: Starting GSSAPI Proxy Daemon... Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4667] manager[0x55f0caa47020]: monitoring kernel firmware directory '/lib/firmware'. 
Feb 20 01:37:27 localhost systemd[1]: Starting Enable periodic update of entitlement certificates.... Feb 20 01:37:27 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4716] hostname: hostname: using hostnamed Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4717] hostname: static hostname changed from (none) to "np0005625204.novalocal" Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4730] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto) Feb 20 01:37:27 localhost systemd[1]: Started Enable periodic update of entitlement certificates.. Feb 20 01:37:27 localhost systemd[1]: Started GSSAPI Proxy Daemon. Feb 20 01:37:27 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab). Feb 20 01:37:27 localhost systemd[1]: Reached target NFS client services. Feb 20 01:37:27 localhost systemd[1]: Reached target Preparation for Remote File Systems. Feb 20 01:37:27 localhost systemd[1]: Reached target Remote File Systems. Feb 20 01:37:27 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4888] manager[0x55f0caa47020]: rfkill: Wi-Fi hardware radio set enabled Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4889] manager[0x55f0caa47020]: rfkill: WWAN hardware radio set enabled Feb 20 01:37:27 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch. 
Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4952] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so) Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4953] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4966] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.4969] manager: Networking is enabled by state file Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5010] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so") Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5012] settings: Loaded settings plugin: keyfile (internal) Feb 20 01:37:27 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5042] dhcp: init: Using DHCP client 'internal' Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5044] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1) Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5057] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5061] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5069] device (lo): Activation: starting connection 'lo' (b35a86af-6461-4196-bb8b-daceaa528560) Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5077] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2) Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5080] device (eth0): state change: unmanaged -> unavailable (reason 'managed', 
sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5114] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5117] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5119] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5120] device (eth0): carrier: link connected Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5122] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5126] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5131] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5135] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5136] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5138] manager: NetworkManager state is now CONNECTING Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5140] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5155] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5157] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds) Feb 20 
01:37:27 localhost NetworkManager[788]: [1771569447.5188] dhcp4 (eth0): state changed new lease, address=38.102.83.80 Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5191] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS Feb 20 01:37:27 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5212] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5271] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5274] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5280] device (lo): Activation: successful, device activated. Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5286] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5288] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed') Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5291] manager: NetworkManager state is now CONNECTED_SITE Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5295] device (eth0): Activation: successful, device activated. Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5300] manager: NetworkManager state is now CONNECTED_GLOBAL Feb 20 01:37:27 localhost NetworkManager[788]: [1771569447.5305] manager: startup complete Feb 20 01:37:27 localhost systemd[1]: Finished Network Manager Wait Online. Feb 20 01:37:27 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)... Feb 20 01:37:27 localhost cloud-init[936]: Cloud-init v. 
22.1-9.el9 running 'init' at Fri, 20 Feb 2026 06:37:27 +0000. Up 6.98 seconds.
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | Device | Up | Address | Mask | Scope | Hw-Address |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | eth0 | True | 38.102.83.80 | 255.255.255.0 | global | fa:16:3e:f0:29:e2 |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | eth0 | True | fe80::f816:3eff:fef0:29e2/64 | . | link | fa:16:3e:f0:29:e2 |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | lo | True | 127.0.0.1 | 255.0.0.0 | host | . |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | lo | True | ::1/128 | . | host | . |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | Route | Destination | Gateway | Genmask | Interface | Flags |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | 0 | 0.0.0.0 | 38.102.83.1 | 0.0.0.0 | eth0 | UG |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | 1 | 38.102.83.0 | 0.0.0.0 | 255.255.255.0 | eth0 | U |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | 2 | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 | eth0 | UGH |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | 1 | fe80::/64 | :: | eth0 | U |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: | 3 | multicast | :: | eth0 | U |
Feb 20 01:37:27 localhost cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 20 01:37:27 localhost systemd[1]: Starting Authorization Manager...
Feb 20 01:37:28 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Feb 20 01:37:28 localhost polkitd[1035]: Started polkitd version 0.117
Feb 20 01:37:28 localhost systemd[1]: Started Authorization Manager.
Feb 20 01:37:31 localhost cloud-init[936]: Generating public/private rsa key pair.
Feb 20 01:37:31 localhost cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 20 01:37:31 localhost cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 20 01:37:31 localhost cloud-init[936]: The key fingerprint is:
Feb 20 01:37:31 localhost cloud-init[936]: SHA256:E91G8EdNxLVmgA2Vm1JmrWN5rMaiGWlWyNAnQy6MrH8 root@np0005625204.novalocal
Feb 20 01:37:31 localhost cloud-init[936]: The key's randomart image is:
Feb 20 01:37:31 localhost cloud-init[936]: +---[RSA 3072]----+
Feb 20 01:37:31 localhost cloud-init[936]: | o..o*o+==|
Feb 20 01:37:31 localhost cloud-init[936]: | . o..= * B..+|
Feb 20 01:37:31 localhost cloud-init[936]: | o o+.* B B+ |
Feb 20 01:37:31 localhost cloud-init[936]: | . .+ + Ooo |
Feb 20 01:37:31 localhost cloud-init[936]: | . S o + + |
Feb 20 01:37:31 localhost cloud-init[936]: | . * . + |
Feb 20 01:37:31 localhost cloud-init[936]: | . Eo + o |
Feb 20 01:37:31 localhost cloud-init[936]: | . o |
Feb 20 01:37:31 localhost cloud-init[936]: | |
Feb 20 01:37:31 localhost cloud-init[936]: +----[SHA256]-----+
Feb 20 01:37:31 localhost cloud-init[936]: Generating public/private ecdsa key pair.
Feb 20 01:37:31 localhost cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 20 01:37:31 localhost cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 20 01:37:31 localhost cloud-init[936]: The key fingerprint is:
Feb 20 01:37:31 localhost cloud-init[936]: SHA256:aQlEOhbipyPb2qhzcpToy+ulg+XetNgyfycgUOryKOw root@np0005625204.novalocal
Feb 20 01:37:31 localhost cloud-init[936]: The key's randomart image is:
Feb 20 01:37:31 localhost cloud-init[936]: +---[ECDSA 256]---+
Feb 20 01:37:31 localhost cloud-init[936]: | . ..o |
Feb 20 01:37:31 localhost cloud-init[936]: | ... + |
Feb 20 01:37:31 localhost cloud-init[936]: | o. = . |
Feb 20 01:37:31 localhost cloud-init[936]: |o + . . o |
Feb 20 01:37:31 localhost cloud-init[936]: |ooo. S |
Feb 20 01:37:31 localhost cloud-init[936]: |o==.. . |
Feb 20 01:37:31 localhost cloud-init[936]: |**.o.. |
Feb 20 01:37:31 localhost cloud-init[936]: |BBX= .o . |
Feb 20 01:37:31 localhost cloud-init[936]: |OEB+=. o |
Feb 20 01:37:31 localhost cloud-init[936]: +----[SHA256]-----+
Feb 20 01:37:31 localhost cloud-init[936]: Generating public/private ed25519 key pair.
Feb 20 01:37:31 localhost cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 20 01:37:31 localhost cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 20 01:37:31 localhost cloud-init[936]: The key fingerprint is:
Feb 20 01:37:31 localhost cloud-init[936]: SHA256:/ksuBpRMcIozXwWEz9Nvt9HqjKrIKBmCAauXD8R0p04 root@np0005625204.novalocal
Feb 20 01:37:31 localhost cloud-init[936]: The key's randomart image is:
Feb 20 01:37:31 localhost cloud-init[936]: +--[ED25519 256]--+
Feb 20 01:37:31 localhost cloud-init[936]: | .++.. |
Feb 20 01:37:31 localhost cloud-init[936]: |. ..ooo. |
Feb 20 01:37:31 localhost cloud-init[936]: |.++..B.o |
Feb 20 01:37:31 localhost cloud-init[936]: |o o+E.B . |
Feb 20 01:37:31 localhost cloud-init[936]: |oo +.. .S. . |
Feb 20 01:37:31 localhost cloud-init[936]: |= + . .. o o . |
Feb 20 01:37:31 localhost cloud-init[936]: |.+ o ..... + |
Feb 20 01:37:31 localhost cloud-init[936]: |o o.. o+ oo |
Feb 20 01:37:31 localhost cloud-init[936]: | .. o .o.o=oo |
Feb 20 01:37:31 localhost cloud-init[936]: +----[SHA256]-----+
Feb 20 01:37:31 localhost sm-notify[1131]: Version 2.5.4 starting
Feb 20 01:37:31 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Feb 20 01:37:31 localhost systemd[1]: Reached target Cloud-config availability.
Feb 20 01:37:31 localhost systemd[1]: Reached target Network is Online. Feb 20 01:37:31 localhost sshd[1140]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost systemd[1]: Starting Apply the settings specified in cloud-config... Feb 20 01:37:31 localhost sshd[1132]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot). Feb 20 01:37:31 localhost systemd[1]: Starting Crash recovery kernel arming... Feb 20 01:37:31 localhost systemd[1]: Starting Notify NFS peers of a restart... Feb 20 01:37:31 localhost systemd[1]: Starting OpenSSH server daemon... Feb 20 01:37:31 localhost systemd[1]: Starting Permit User Sessions... Feb 20 01:37:31 localhost systemd[1]: Started Notify NFS peers of a restart. Feb 20 01:37:31 localhost systemd[1]: Finished Permit User Sessions. Feb 20 01:37:31 localhost systemd[1]: Started OpenSSH server daemon. Feb 20 01:37:31 localhost systemd[1]: Started Command Scheduler. Feb 20 01:37:31 localhost systemd[1]: Started Getty on tty1. Feb 20 01:37:31 localhost systemd[1]: Started Serial Getty on ttyS0. Feb 20 01:37:31 localhost systemd[1]: Reached target Login Prompts. Feb 20 01:37:31 localhost systemd[1]: Reached target Multi-User System. Feb 20 01:37:31 localhost systemd[1]: Starting Record Runlevel Change in UTMP... Feb 20 01:37:31 localhost sshd[1154]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Feb 20 01:37:31 localhost systemd[1]: Finished Record Runlevel Change in UTMP. Feb 20 01:37:31 localhost sshd[1171]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost sshd[1184]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost kdumpctl[1134]: kdump: No kdump initial ramdisk found. 
Feb 20 01:37:31 localhost kdumpctl[1134]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img Feb 20 01:37:31 localhost sshd[1195]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost sshd[1204]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost sshd[1218]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost sshd[1236]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost cloud-init[1274]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Fri, 20 Feb 2026 06:37:31 +0000. Up 11.01 seconds. Feb 20 01:37:31 localhost sshd[1268]: main: sshd: ssh-rsa algorithm is disabled Feb 20 01:37:31 localhost systemd[1]: Finished Apply the settings specified in cloud-config. Feb 20 01:37:31 localhost systemd[1]: Starting Execute cloud user/final scripts... Feb 20 01:37:32 localhost dracut[1437]: dracut-057-21.git20230214.el9 Feb 20 01:37:32 localhost cloud-init[1438]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Fri, 20 Feb 2026 06:37:32 +0000. Up 11.35 seconds. 
Feb 20 01:37:32 localhost cloud-init[1455]: #############################################################
Feb 20 01:37:32 localhost cloud-init[1456]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 20 01:37:32 localhost cloud-init[1460]: 256 SHA256:aQlEOhbipyPb2qhzcpToy+ulg+XetNgyfycgUOryKOw root@np0005625204.novalocal (ECDSA)
Feb 20 01:37:32 localhost cloud-init[1465]: 256 SHA256:/ksuBpRMcIozXwWEz9Nvt9HqjKrIKBmCAauXD8R0p04 root@np0005625204.novalocal (ED25519)
Feb 20 01:37:32 localhost dracut[1440]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Feb 20 01:37:32 localhost cloud-init[1475]: 3072 SHA256:E91G8EdNxLVmgA2Vm1JmrWN5rMaiGWlWyNAnQy6MrH8 root@np0005625204.novalocal (RSA)
Feb 20 01:37:32 localhost cloud-init[1477]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 20 01:37:32 localhost cloud-init[1480]: #############################################################
Feb 20 01:37:32 localhost cloud-init[1438]: Cloud-init v. 22.1-9.el9 finished at Fri, 20 Feb 2026 06:37:32 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 11.60 seconds
Feb 20 01:37:32 localhost systemd[1]: Reloading Network Manager...
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 20 01:37:32 localhost NetworkManager[788]: [1771569452.5003] audit: op="reload" arg="0" pid=1577 uid=0 result="success"
Feb 20 01:37:32 localhost NetworkManager[788]: [1771569452.5013] config: signal: SIGHUP (no changes from disk)
Feb 20 01:37:32 localhost systemd[1]: Reloaded Network Manager.
Feb 20 01:37:32 localhost systemd[1]: Finished Execute cloud user/final scripts.
Feb 20 01:37:32 localhost systemd[1]: Reached target Cloud-init target.
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 20 01:37:32 localhost chronyd[765]: Selected source 199.182.221.110 (2.rhel.pool.ntp.org)
Feb 20 01:37:32 localhost chronyd[765]: System clock TAI offset set to 37 seconds
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 20 01:37:32 localhost dracut[1440]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: memstrack is not available
Feb 20 01:37:33 localhost dracut[1440]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 20 01:37:33 localhost dracut[1440]: memstrack is not available
Feb 20 01:37:33 localhost dracut[1440]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 20 01:37:33 localhost dracut[1440]: *** Including module: systemd ***
Feb 20 01:37:33 localhost dracut[1440]: *** Including module: systemd-initrd ***
Feb 20 01:37:33 localhost dracut[1440]: *** Including module: i18n ***
Feb 20 01:37:33 localhost dracut[1440]: No KEYMAP configured.
Feb 20 01:37:33 localhost dracut[1440]: *** Including module: drm ***
Feb 20 01:37:34 localhost chronyd[765]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Feb 20 01:37:34 localhost dracut[1440]: *** Including module: prefixdevname ***
Feb 20 01:37:34 localhost dracut[1440]: *** Including module: kernel-modules ***
Feb 20 01:37:34 localhost dracut[1440]: *** Including module: kernel-modules-extra ***
Feb 20 01:37:34 localhost dracut[1440]: *** Including module: qemu ***
Feb 20 01:37:34 localhost dracut[1440]: *** Including module: fstab-sys ***
Feb 20 01:37:34 localhost dracut[1440]: *** Including module: rootfs-block ***
Feb 20 01:37:34 localhost dracut[1440]: *** Including module: terminfo ***
Feb 20 01:37:34 localhost dracut[1440]: *** Including module: udev-rules ***
Feb 20 01:37:35 localhost dracut[1440]: Skipping udev rule: 91-permissions.rules
Feb 20 01:37:35 localhost dracut[1440]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 20 01:37:35 localhost dracut[1440]: *** Including module: virtiofs ***
Feb 20 01:37:35 localhost dracut[1440]: *** Including module: dracut-systemd ***
Feb 20 01:37:35 localhost dracut[1440]: *** Including module: usrmount ***
Feb 20 01:37:35 localhost dracut[1440]: *** Including module: base ***
Feb 20 01:37:35 localhost dracut[1440]: *** Including module: fs-lib ***
Feb 20 01:37:35 localhost dracut[1440]: *** Including module: kdumpbase ***
Feb 20 01:37:35 localhost dracut[1440]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 20 01:37:35 localhost dracut[1440]: microcode_ctl module: mangling fw_dir
Feb 20 01:37:35 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 20 01:37:36 localhost dracut[1440]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Feb 20 01:37:36 localhost dracut[1440]: *** Including module: shutdown ***
Feb 20 01:37:36 localhost dracut[1440]: *** Including module: squash ***
Feb 20 01:37:36 localhost dracut[1440]: *** Including modules done ***
Feb 20 01:37:36 localhost dracut[1440]: *** Installing kernel module dependencies ***
Feb 20 01:37:36 localhost dracut[1440]: *** Installing kernel module dependencies done ***
Feb 20 01:37:36 localhost dracut[1440]: *** Resolving executable dependencies ***
Feb 20 01:37:37 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 01:37:38 localhost dracut[1440]: *** Resolving executable dependencies done ***
Feb 20 01:37:38 localhost dracut[1440]: *** Hardlinking files ***
Feb 20 01:37:38 localhost dracut[1440]: Mode: real
Feb 20 01:37:38 localhost dracut[1440]: Files: 1099
Feb 20 01:37:38 localhost dracut[1440]: Linked: 3 files
Feb 20 01:37:38 localhost dracut[1440]: Compared: 0 xattrs
Feb 20 01:37:38 localhost dracut[1440]: Compared: 373 files
Feb 20 01:37:38 localhost dracut[1440]: Saved: 61.04 KiB
Feb 20 01:37:38 localhost dracut[1440]: Duration: 0.035561 seconds
Feb 20 01:37:38 localhost dracut[1440]: *** Hardlinking files done ***
Feb 20 01:37:38 localhost dracut[1440]: Could not find 'strip'. Not stripping the initramfs.
Feb 20 01:37:38 localhost dracut[1440]: *** Generating early-microcode cpio image ***
Feb 20 01:37:38 localhost dracut[1440]: *** Constructing AuthenticAMD.bin ***
Feb 20 01:37:38 localhost dracut[1440]: *** Store current command line parameters ***
Feb 20 01:37:38 localhost dracut[1440]: Stored kernel commandline:
Feb 20 01:37:38 localhost dracut[1440]: No dracut internal kernel commandline stored in the initramfs
Feb 20 01:37:38 localhost dracut[1440]: *** Install squash loader ***
Feb 20 01:37:39 localhost dracut[1440]: *** Squashing the files inside the initramfs ***
Feb 20 01:37:40 localhost dracut[1440]: *** Squashing the files inside the initramfs done ***
Feb 20 01:37:40 localhost dracut[1440]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Feb 20 01:37:40 localhost dracut[1440]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Feb 20 01:37:40 localhost kdumpctl[1134]: kdump: kexec: loaded kdump kernel
Feb 20 01:37:40 localhost kdumpctl[1134]: kdump: Starting kdump: [OK]
Feb 20 01:37:40 localhost systemd[1]: Finished Crash recovery kernel arming.
Feb 20 01:37:40 localhost systemd[1]: Startup finished in 1.214s (kernel) + 2.066s (initrd) + 16.639s (userspace) = 19.920s.
Feb 20 01:37:57 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 20 01:38:17 localhost sshd[4176]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:38:17 localhost systemd[1]: Created slice User Slice of UID 1000.
Feb 20 01:38:17 localhost systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 20 01:38:17 localhost systemd-logind[759]: New session 1 of user zuul.
Feb 20 01:38:17 localhost systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 20 01:38:17 localhost systemd[1]: Starting User Manager for UID 1000...
Feb 20 01:38:17 localhost systemd[4180]: Queued start job for default target Main User Target.
Feb 20 01:38:17 localhost systemd[4180]: Created slice User Application Slice.
Feb 20 01:38:17 localhost systemd[4180]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 20 01:38:17 localhost systemd[4180]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 01:38:17 localhost systemd[4180]: Reached target Paths.
Feb 20 01:38:17 localhost systemd[4180]: Reached target Timers.
Feb 20 01:38:17 localhost systemd[4180]: Starting D-Bus User Message Bus Socket...
Feb 20 01:38:17 localhost systemd[4180]: Starting Create User's Volatile Files and Directories...
Feb 20 01:38:17 localhost systemd[4180]: Finished Create User's Volatile Files and Directories.
Feb 20 01:38:17 localhost systemd[4180]: Listening on D-Bus User Message Bus Socket.
Feb 20 01:38:17 localhost systemd[4180]: Reached target Sockets.
Feb 20 01:38:17 localhost systemd[4180]: Reached target Basic System.
Feb 20 01:38:17 localhost systemd[4180]: Reached target Main User Target.
Feb 20 01:38:17 localhost systemd[4180]: Startup finished in 114ms.
Feb 20 01:38:17 localhost systemd[1]: Started User Manager for UID 1000.
Feb 20 01:38:17 localhost systemd[1]: Started Session 1 of User zuul.
Feb 20 01:38:18 localhost python3[4232]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 01:38:27 localhost python3[4250]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 01:38:33 localhost python3[4303]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 01:38:34 localhost python3[4333]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 20 01:38:37 localhost python3[4349]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:38:38 localhost python3[4363]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:39 localhost python3[4422]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:38:40 localhost python3[4463]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771569519.4242065-396-28194802573997/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=2f497d714b8e492188c98067e59a9994_id_rsa follow=False checksum=1ede725f5cdca64ff103c7e62f7bb7b42f0b9244 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:41 localhost python3[4536]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:38:41 localhost python3[4577]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771569521.188387-494-117951666179636/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=2f497d714b8e492188c98067e59a9994_id_rsa.pub follow=False checksum=d5896bb6dcd221ffe99ce3acccb68a5152af8369 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:43 localhost python3[4605]: ansible-ping Invoked with data=pong
Feb 20 01:38:45 localhost python3[4619]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 01:38:49 localhost python3[4673]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 20 01:38:51 localhost python3[4695]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:51 localhost python3[4709]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:52 localhost python3[4723]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:53 localhost python3[4737]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:53 localhost python3[4751]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:53 localhost python3[4765]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:56 localhost python3[4781]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:38:58 localhost python3[4829]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:38:58 localhost python3[4872]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771569537.8461342-104-85037354026452/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:39:06 localhost python3[4900]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:06 localhost python3[4914]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:06 localhost python3[4928]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:06 localhost python3[4942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:07 localhost python3[4956]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:07 localhost python3[4970]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:07 localhost python3[4984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:07 localhost python3[4998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:08 localhost python3[5012]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:08 localhost python3[5026]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:08 localhost python3[5040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:09 localhost python3[5054]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:09 localhost python3[5068]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:09 localhost python3[5082]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:09 localhost python3[5096]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:10 localhost python3[5110]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:10 localhost python3[5124]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:10 localhost python3[5138]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:11 localhost python3[5152]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:11 localhost python3[5166]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:11 localhost python3[5180]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:11 localhost python3[5194]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:12 localhost python3[5208]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:39:12 localhost python3[5222]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1
vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 20 01:39:12 localhost python3[5236]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 20 01:39:12 localhost python3[5250]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 20 01:39:13 localhost python3[5266]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Feb 20 01:39:13 localhost systemd[1]: Starting Time & Date Service... Feb 20 01:39:13 localhost systemd[1]: Started Time & Date Service. Feb 20 01:39:14 localhost systemd-timedated[5268]: Changed time zone to 'UTC' (UTC). 
Feb 20 01:39:14 localhost python3[5287]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:39:15 localhost python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:39:16 localhost python3[5374]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771569555.625111-497-82497591994618/source _original_basename=tmp78_25nce follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:39:17 localhost python3[5434]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:39:17 localhost python3[5475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771569557.1648777-588-22594415562177/source _original_basename=tmpc1tibo3d follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:39:19 localhost python3[5537]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:39:19 localhost python3[5580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771569559.2069945-732-235973663411204/source _original_basename=tmpyjse3d76 follow=False checksum=d65fea983e4ac4bc5449bcd3fb3aadcab86a1db0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:39:20 localhost python3[5608]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:39:21 localhost python3[5624]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:39:22 localhost python3[5674]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:39:22 localhost python3[5717]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771569562.132321-859-54902510195330/source _original_basename=tmp8m97n_kz follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:39:23 localhost python3[5748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-ff2a-a63c-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:39:25 localhost python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-ff2a-a63c-000000000024-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 20 01:39:26 localhost python3[5785]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:39:44 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 20 01:39:46 localhost python3[5803]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:40:46 localhost systemd-logind[759]: Session 1 logged out. Waiting for processes to exit.
Feb 20 01:40:56 localhost systemd[4180]: Starting Mark boot as successful...
Feb 20 01:40:56 localhost systemd[4180]: Finished Mark boot as successful.
Feb 20 01:41:27 localhost systemd[1]: Unmounting EFI System Partition Automount...
Feb 20 01:41:27 localhost systemd[1]: efi.mount: Deactivated successfully.
Feb 20 01:41:27 localhost systemd[1]: Unmounted EFI System Partition Automount.
Feb 20 01:43:31 localhost sshd[5810]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:43:35 localhost sshd[5812]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:43:40 localhost sshd[5814]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:43:43 localhost sshd[5816]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:43:46 localhost sshd[5818]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f]
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Feb 20 01:43:47 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f]
Feb 20 01:43:47 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2012] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 20 01:43:47 localhost systemd-udevd[5821]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2156] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2191] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2196] device (eth1): carrier: link connected
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2200] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2207] policy: auto-activating connection 'Wired connection 1' (f176c6dc-cf7e-3130-a878-662e47281df8)
Feb 20 01:43:47 localhost systemd[4180]: Created slice User Background Tasks Slice.
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2219] device (eth1): Activation: starting connection 'Wired connection 1' (f176c6dc-cf7e-3130-a878-662e47281df8)
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2222] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2229] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2236] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 20 01:43:47 localhost NetworkManager[788]: [1771569827.2242] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 20 01:43:47 localhost systemd[4180]: Starting Cleanup of User's Temporary Files and Directories...
Feb 20 01:43:47 localhost systemd[4180]: Finished Cleanup of User's Temporary Files and Directories.
Feb 20 01:43:47 localhost sshd[5824]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:43:47 localhost systemd-logind[759]: New session 3 of user zuul.
Feb 20 01:43:47 localhost systemd[1]: Started Session 3 of User zuul.
Feb 20 01:43:48 localhost python3[5841]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-fb18-e746-000000000475-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:43:48 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Feb 20 01:43:50 localhost sshd[5844]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:43:54 localhost sshd[5846]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:43:57 localhost sshd[5848]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:01 localhost python3[5897]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:44:01 localhost python3[5940]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771569841.1756814-537-139979349408384/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=3a3f85cb9d3a049d92e27c50f152d5abef64c350 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:44:02 localhost sshd[5971]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:02 localhost python3[5970]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 01:44:02 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 20 01:44:02 localhost systemd[1]: Stopped Network Manager Wait Online.
Feb 20 01:44:02 localhost systemd[1]: Stopping Network Manager Wait Online...
Feb 20 01:44:02 localhost systemd[1]: Stopping Network Manager...
Feb 20 01:44:02 localhost NetworkManager[788]: [1771569842.4880] caught SIGTERM, shutting down normally.
Feb 20 01:44:02 localhost NetworkManager[788]: [1771569842.5048] dhcp4 (eth0): canceled DHCP transaction
Feb 20 01:44:02 localhost NetworkManager[788]: [1771569842.5049] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 20 01:44:02 localhost NetworkManager[788]: [1771569842.5049] dhcp4 (eth0): state changed no lease
Feb 20 01:44:02 localhost NetworkManager[788]: [1771569842.5053] manager: NetworkManager state is now CONNECTING
Feb 20 01:44:02 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 01:44:02 localhost NetworkManager[788]: [1771569842.5148] dhcp4 (eth1): canceled DHCP transaction
Feb 20 01:44:02 localhost NetworkManager[788]: [1771569842.5149] dhcp4 (eth1): state changed no lease
Feb 20 01:44:02 localhost NetworkManager[788]: [1771569842.5215] exiting (success)
Feb 20 01:44:02 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 01:44:02 localhost systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 20 01:44:02 localhost systemd[1]: Stopped Network Manager.
Feb 20 01:44:02 localhost systemd[1]: NetworkManager.service: Consumed 2.215s CPU time.
Feb 20 01:44:02 localhost systemd[1]: Starting Network Manager...
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.5859] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:de550921-08e7-4d93-b7d2-b745d62af5c6)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.5862] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.5888] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 20 01:44:02 localhost systemd[1]: Started Network Manager.
Feb 20 01:44:02 localhost systemd[1]: Starting Network Manager Wait Online...
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.5950] manager[0x55adec924090]: monitoring kernel firmware directory '/lib/firmware'.
Feb 20 01:44:02 localhost systemd[1]: Starting Hostname Service...
Feb 20 01:44:02 localhost systemd[1]: Started Hostname Service.
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6671] hostname: hostname: using hostnamed
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6672] hostname: static hostname changed from (none) to "np0005625204.novalocal"
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6680] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6687] manager[0x55adec924090]: rfkill: Wi-Fi hardware radio set enabled
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6688] manager[0x55adec924090]: rfkill: WWAN hardware radio set enabled
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6728] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6729] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6729] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6730] manager: Networking is enabled by state file
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6746] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6747] settings: Loaded settings plugin: keyfile (internal)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6799] dhcp: init: Using DHCP client 'internal'
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6802] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6809] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6816] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6827] device (lo): Activation: starting connection 'lo' (b35a86af-6461-4196-bb8b-daceaa528560)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6835] device (eth0): carrier: link connected
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6841] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6846] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6847] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6855] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6864] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6871] device (eth1): carrier: link connected
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6877] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6884] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (f176c6dc-cf7e-3130-a878-662e47281df8) (indicated)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6884] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6890] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6899] device (eth1): Activation: starting connection 'Wired connection 1' (f176c6dc-cf7e-3130-a878-662e47281df8)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6922] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6925] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6928] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6930] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6934] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6936] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6940] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6943] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6949] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6952] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6964] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.6966] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7003] dhcp4 (eth0): state changed new lease, address=38.102.83.80
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7012] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7130] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7135] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7141] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7146] device (lo): Activation: successful, device activated.
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7186] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7188] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7193] manager: NetworkManager state is now CONNECTED_SITE
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7196] device (eth0): Activation: successful, device activated.
Feb 20 01:44:02 localhost NetworkManager[5988]: [1771569842.7201] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 20 01:44:02 localhost python3[6042]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-fb18-e746-000000000136-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:44:06 localhost sshd[6057]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:09 localhost sshd[6059]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:12 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 01:44:14 localhost sshd[6061]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:17 localhost sshd[6063]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:20 localhost sshd[6065]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:24 localhost sshd[6067]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:28 localhost sshd[6069]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:31 localhost sshd[6071]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:32 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 20 01:44:34 localhost sshd[6076]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:38 localhost sshd[6078]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:41 localhost sshd[6080]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:46 localhost sshd[6082]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:47 localhost NetworkManager[5988]: [1771569887.7621] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:47 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 01:44:47 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 01:44:47 localhost NetworkManager[5988]: [1771569887.7860] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:47 localhost NetworkManager[5988]: [1771569887.7863] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 20 01:44:47 localhost NetworkManager[5988]: [1771569887.7873] device (eth1): Activation: successful, device activated.
Feb 20 01:44:47 localhost NetworkManager[5988]: [1771569887.7881] manager: startup complete
Feb 20 01:44:47 localhost systemd[1]: Finished Network Manager Wait Online.
Feb 20 01:44:52 localhost sshd[6097]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:56 localhost sshd[6099]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:44:57 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 01:44:59 localhost sshd[6101]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:03 localhost systemd[1]: session-3.scope: Deactivated successfully.
Feb 20 01:45:03 localhost systemd[1]: session-3.scope: Consumed 1.537s CPU time.
Feb 20 01:45:03 localhost systemd-logind[759]: Session 3 logged out. Waiting for processes to exit.
Feb 20 01:45:03 localhost systemd-logind[759]: Removed session 3.
Feb 20 01:45:03 localhost sshd[6103]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:06 localhost sshd[6105]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:10 localhost sshd[6107]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:15 localhost sshd[6109]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:18 localhost sshd[6111]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:22 localhost sshd[6113]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:25 localhost sshd[6115]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:25 localhost systemd-logind[759]: New session 4 of user zuul.
Feb 20 01:45:25 localhost systemd[1]: Started Session 4 of User zuul.
Feb 20 01:45:25 localhost python3[6166]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:45:25 localhost python3[6209]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771569925.270666-628-183704899478326/source _original_basename=tmpc235pqzn follow=False checksum=1adafc0c3cabf5458281c7d741082eddefa40194 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:45:26 localhost sshd[6224]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:28 localhost systemd[1]: session-4.scope: Deactivated successfully.
Feb 20 01:45:28 localhost systemd-logind[759]: Session 4 logged out. Waiting for processes to exit.
Feb 20 01:45:28 localhost systemd-logind[759]: Removed session 4.
Feb 20 01:45:30 localhost sshd[6226]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:45:33 localhost sshd[6228]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:52:27 localhost sshd[6233]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:52:27 localhost systemd[1]: Starting Cleanup of Temporary Directories...
Feb 20 01:52:27 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 20 01:52:27 localhost systemd[1]: Finished Cleanup of Temporary Directories.
Feb 20 01:52:27 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 20 01:52:27 localhost systemd-logind[759]: New session 5 of user zuul.
Feb 20 01:52:27 localhost systemd[1]: Started Session 5 of User zuul.
Feb 20 01:52:27 localhost python3[6256]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-12ca-cc53-00000000219f-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:52:29 localhost python3[6275]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:52:29 localhost python3[6291]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:52:29 localhost python3[6307]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:52:30 localhost python3[6323]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:52:30 localhost python3[6339]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:52:32 localhost python3[6387]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:52:32 localhost python3[6430]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771570351.7884274-667-182459782895875/source _original_basename=tmp_jn8vaqc follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:52:34 localhost python3[6460]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 01:52:34 localhost systemd[1]: Reloading.
Feb 20 01:52:34 localhost systemd-rc-local-generator[6481]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 01:52:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 01:52:35 localhost python3[6507]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 20 01:52:36 localhost python3[6523]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:52:37 localhost python3[6541]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:52:37 localhost python3[6559]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:52:37 localhost python3[6577]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:52:48 localhost python3[6594]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-12ca-cc53-0000000021a6-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:52:49 localhost python3[6614]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 01:52:52 localhost systemd[1]: session-5.scope: Deactivated successfully.
Feb 20 01:52:52 localhost systemd[1]: session-5.scope: Consumed 4.053s CPU time.
Feb 20 01:52:52 localhost systemd-logind[759]: Session 5 logged out. Waiting for processes to exit.
Feb 20 01:52:52 localhost systemd-logind[759]: Removed session 5.
Feb 20 01:53:53 localhost sshd[6620]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:53:53 localhost systemd-logind[759]: New session 6 of user zuul.
Feb 20 01:53:53 localhost systemd[1]: Started Session 6 of User zuul.
Feb 20 01:53:54 localhost systemd[1]: Starting RHSM dbus service...
Feb 20 01:53:54 localhost systemd[1]: Started RHSM dbus service.
Feb 20 01:53:54 localhost rhsm-service[6644]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 01:53:54 localhost rhsm-service[6644]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 01:53:54 localhost rhsm-service[6644]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 01:53:54 localhost rhsm-service[6644]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 01:53:56 localhost rhsm-service[6644]: INFO [subscription_manager.managerlib:90] Consumer created: np0005625204.novalocal (430a9023-94d5-4ff5-8ad4-f0155783873a)
Feb 20 01:53:56 localhost subscription-manager[6644]: Registered system with identity: 430a9023-94d5-4ff5-8ad4-f0155783873a
Feb 20 01:53:56 localhost rhsm-service[6644]: INFO [subscription_manager.entcertlib:131] certs updated:
Feb 20 01:53:56 localhost rhsm-service[6644]: Total updates: 1
Feb 20 01:53:56 localhost rhsm-service[6644]: Found (local) serial# []
Feb 20 01:53:56 localhost rhsm-service[6644]: Expected (UEP) serial# [5588354398145753591]
Feb 20 01:53:56 localhost rhsm-service[6644]: Added (new)
Feb 20 01:53:56 localhost rhsm-service[6644]: [sn:5588354398145753591 ( Content Access,) @ /etc/pki/entitlement/5588354398145753591.pem]
Feb 20 01:53:56 localhost rhsm-service[6644]: Deleted (rogue):
Feb 20 01:53:56 localhost rhsm-service[6644]:
Feb 20 01:53:56 localhost subscription-manager[6644]: Added subscription for 'Content Access' contract 'None'
Feb 20 01:53:56 localhost subscription-manager[6644]: Added subscription for product ' Content Access'
Feb 20 01:53:57 localhost rhsm-service[6644]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 01:53:57 localhost rhsm-service[6644]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 01:53:57 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 01:53:58 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 01:53:58 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 01:53:58 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 01:53:58 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 01:54:00 localhost python3[6735]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-d2eb-5884-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 01:55:00 localhost systemd[1]: session-6.scope: Deactivated successfully.
Feb 20 01:55:00 localhost systemd[1]: session-6.scope: Consumed 1.567s CPU time.
Feb 20 01:55:00 localhost systemd-logind[759]: Session 6 logged out. Waiting for processes to exit.
Feb 20 01:55:00 localhost systemd-logind[759]: Removed session 6.
Feb 20 01:55:03 localhost sshd[6740]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:55:03 localhost systemd-logind[759]: New session 7 of user zuul.
Feb 20 01:55:03 localhost systemd[1]: Started Session 7 of User zuul.
Feb 20 01:55:03 localhost python3[6759]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 01:55:09 localhost sshd[6766]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:55:28 localhost sshd[6779]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:55:32 localhost setsebool[6836]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 20 01:55:32 localhost setsebool[6836]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 20 01:55:41 localhost kernel: SELinux: Converting 406 SID table entries...
Feb 20 01:55:41 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 01:55:41 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 01:55:41 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 01:55:41 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 01:55:41 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 01:55:41 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 01:55:41 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 01:55:53 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=3 res=1
Feb 20 01:55:53 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 01:55:53 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 20 01:55:53 localhost systemd[1]: Reloading.
Feb 20 01:55:53 localhost systemd-rc-local-generator[7693]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 01:55:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 01:55:53 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 01:55:55 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 01:55:55 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 01:56:00 localhost podman[16246]: 2026-02-20 06:56:00.28808241 +0000 UTC m=+0.103332790 system refresh
Feb 20 01:56:01 localhost systemd[4180]: Starting D-Bus User Message Bus...
Feb 20 01:56:01 localhost dbus-broker-launch[17399]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 20 01:56:01 localhost dbus-broker-launch[17399]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 20 01:56:01 localhost systemd[4180]: Started D-Bus User Message Bus.
Feb 20 01:56:01 localhost journal[17399]: Ready
Feb 20 01:56:01 localhost systemd[4180]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1
Feb 20 01:56:01 localhost systemd[4180]: Created slice Slice /user.
Feb 20 01:56:01 localhost systemd[4180]: podman-17269.scope: unit configures an IP firewall, but not running as root.
Feb 20 01:56:01 localhost systemd[4180]: (This warning is only shown for the first unit using IP firewalling.)
Feb 20 01:56:01 localhost systemd[4180]: Started podman-17269.scope.
Feb 20 01:56:01 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 01:56:01 localhost systemd[4180]: Started podman-pause-6d03a9a2.scope.
Feb 20 01:56:02 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 01:56:02 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 20 01:56:02 localhost systemd[1]: man-db-cache-update.service: Consumed 10.323s CPU time.
Feb 20 01:56:02 localhost systemd[1]: run-r45b5324098c246c3ae3dacad90a0c586.service: Deactivated successfully.
Feb 20 01:56:03 localhost systemd[1]: session-7.scope: Deactivated successfully.
Feb 20 01:56:03 localhost systemd[1]: session-7.scope: Consumed 48.546s CPU time.
Feb 20 01:56:03 localhost systemd-logind[759]: Session 7 logged out. Waiting for processes to exit.
Feb 20 01:56:03 localhost systemd-logind[759]: Removed session 7.
Feb 20 01:56:19 localhost sshd[18493]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:56:19 localhost sshd[18492]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:56:19 localhost sshd[18494]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:56:19 localhost sshd[18496]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:56:19 localhost sshd[18495]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:56:23 localhost sshd[18502]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:56:23 localhost systemd-logind[759]: New session 8 of user zuul.
Feb 20 01:56:23 localhost systemd[1]: Started Session 8 of User zuul.
Feb 20 01:56:23 localhost python3[18519]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHF6ws6TTGIgpcynk+zfDmAiKAngdz4qTSYI5OZYL/Nj9dQsVH9D0sSlKxQpeRN7puQyuA81owKWTQGJzf43DRQ= zuul@np0005625196.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:56:24 localhost python3[18535]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHF6ws6TTGIgpcynk+zfDmAiKAngdz4qTSYI5OZYL/Nj9dQsVH9D0sSlKxQpeRN7puQyuA81owKWTQGJzf43DRQ= zuul@np0005625196.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:56:26 localhost systemd[1]: session-8.scope: Deactivated successfully.
Feb 20 01:56:26 localhost systemd-logind[759]: Session 8 logged out. Waiting for processes to exit.
Feb 20 01:56:26 localhost systemd-logind[759]: Removed session 8.
Feb 20 01:57:45 localhost sshd[18538]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:57:45 localhost systemd-logind[759]: New session 9 of user zuul.
Feb 20 01:57:45 localhost systemd[1]: Started Session 9 of User zuul.
Feb 20 01:57:45 localhost python3[18557]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 01:57:46 localhost python3[18573]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 20 01:57:48 localhost python3[18623]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:57:48 localhost python3[18666]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771570667.8219144-139-8684751804930/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=2f497d714b8e492188c98067e59a9994_id_rsa follow=False checksum=1ede725f5cdca64ff103c7e62f7bb7b42f0b9244 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:57:49 localhost python3[18728]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:57:50 localhost python3[18771]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771570669.45577-228-268540852689489/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=2f497d714b8e492188c98067e59a9994_id_rsa.pub follow=False checksum=d5896bb6dcd221ffe99ce3acccb68a5152af8369 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:57:52 localhost python3[18801]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:57:53 localhost python3[18847]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:57:53 localhost python3[18863]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmp40v7jvle recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:57:54 localhost python3[18923]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:57:54 localhost python3[18939]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmppwxjlel3 recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:57:56 localhost python3[18999]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 01:57:56 localhost python3[19015]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpffh2s3k2 recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 01:57:57 localhost systemd[1]: session-9.scope: Deactivated successfully.
Feb 20 01:57:57 localhost systemd[1]: session-9.scope: Consumed 3.587s CPU time.
Feb 20 01:57:57 localhost systemd-logind[759]: Session 9 logged out. Waiting for processes to exit.
Feb 20 01:57:57 localhost systemd-logind[759]: Removed session 9.
Feb 20 01:58:12 localhost sshd[19030]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 01:58:12 localhost sshd[19031]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:00:07 localhost sshd[19033]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:00:07 localhost systemd-logind[759]: New session 10 of user zuul.
Feb 20 02:00:07 localhost systemd[1]: Started Session 10 of User zuul.
Feb 20 02:00:07 localhost python3[19079]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:05:06 localhost systemd[1]: session-10.scope: Deactivated successfully.
Feb 20 02:05:06 localhost systemd-logind[759]: Session 10 logged out. Waiting for processes to exit.
Feb 20 02:05:06 localhost systemd-logind[759]: Removed session 10.
Feb 20 02:06:08 localhost sshd[19098]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:07:29 localhost sshd[19100]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:08:06 localhost sshd[19102]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:08:50 localhost sshd[19105]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:11:23 localhost sshd[19109]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:11:24 localhost systemd-logind[759]: New session 11 of user zuul.
Feb 20 02:11:24 localhost systemd[1]: Started Session 11 of User zuul.
Feb 20 02:11:24 localhost python3[19126]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-064b-165c-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:11:27 localhost python3[19146]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-064b-165c-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:11:58 localhost python3[19165]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Feb 20 02:12:01 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:12:01 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:12:30 localhost python3[19322]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Feb 20 02:12:33 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:12:50 localhost python3[19462]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Feb 20 02:12:52 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:12:52 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:12:54 localhost sshd[19589]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:12:57 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:13:04 localhost sshd[19784]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:11 localhost sshd[19786]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:18 localhost python3[19803]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 20 02:13:18 localhost sshd[19805]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:21 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:13:21 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:13:25 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:13:26 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:13:26 localhost sshd[20116]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:27 localhost sshd[20120]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:31 localhost sshd[20189]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:34 localhost sshd[20191]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:37 localhost sshd[20193]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:45 localhost sshd[20195]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:46 localhost python3[20211]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 20 02:13:49 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:13:51 localhost sshd[20397]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:13:54 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:13:54 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:14:00 localhost sshd[20535]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:04 localhost python3[20553]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-000000000013-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:14:08 localhost sshd[20557]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:13 localhost sshd[20559]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:22 localhost sshd[20561]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:28 localhost sshd[20563]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:32 localhost python3[20580]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 02:14:36 localhost sshd[20594]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:41 localhost sshd[20635]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:47 localhost sshd[20675]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:51 localhost sshd[20680]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:51 localhost kernel: SELinux: Converting 486 SID table entries...
Feb 20 02:14:51 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 02:14:51 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 02:14:51 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 02:14:51 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 02:14:51 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 02:14:51 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 02:14:51 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 02:14:52 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=4 res=1
Feb 20 02:14:52 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 20 02:14:55 localhost sshd[21011]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:14:55 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 02:14:55 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 20 02:14:55 localhost systemd[1]: Reloading.
Feb 20 02:14:55 localhost systemd-rc-local-generator[21240]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 02:14:55 localhost systemd-sysv-generator[21245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 02:14:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 02:14:56 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 02:14:56 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 02:14:56 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 20 02:14:56 localhost systemd[1]: run-rb9ad198008414985965620ce5def172c.service: Deactivated successfully.
Feb 20 02:14:57 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:14:57 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 02:15:02 localhost sshd[21848]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:06 localhost sshd[21850]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:12 localhost python3[21867]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-000000000015-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:15:12 localhost sshd[21871]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:16 localhost sshd[21873]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:23 localhost sshd[21875]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:32 localhost sshd[21878]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:34 localhost python3[21895]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:15:36 localhost python3[21943]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:15:36 localhost python3[21986]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771571735.942986-332-69464575789040/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=91bc45728dd9738fc644e3ada9d8642294da29ff backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:15:37 localhost sshd[22001]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:38 localhost python3[22018]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 20 02:15:38 localhost systemd-journald[618]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Feb 20 02:15:38 localhost systemd-journald[618]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 02:15:38 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 02:15:38 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 02:15:38 localhost python3[22039]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None
wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 20 02:15:38 localhost sshd[22055]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:15:39 localhost python3[22060]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 20 02:15:39 localhost python3[22081]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 
maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 20 02:15:39 localhost python3[22101]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None 
mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 20 02:15:41 localhost sshd[22106]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:15:42 localhost python3[22123]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 02:15:43 localhost systemd[1]: Starting LSB: Bring up/down networking... Feb 20 02:15:43 localhost network[22126]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 02:15:43 localhost network[22137]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 02:15:43 localhost network[22126]: WARN : [network] 'network-scripts' will be removed from distribution in near future. Feb 20 02:15:43 localhost network[22138]: 'network-scripts' will be removed from distribution in near future. Feb 20 02:15:43 localhost network[22126]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management. Feb 20 02:15:43 localhost network[22139]: It is advised to switch to 'NetworkManager' instead for network management. 
Feb 20 02:15:43 localhost NetworkManager[5988]: [1771571743.5758] audit: op="connections-reload" pid=22167 uid=0 result="success"
Feb 20 02:15:43 localhost network[22126]: Bringing up loopback interface: [ OK ]
Feb 20 02:15:43 localhost NetworkManager[5988]: [1771571743.7763] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22255 uid=0 result="success"
Feb 20 02:15:43 localhost network[22126]: Bringing up interface eth0: [ OK ]
Feb 20 02:15:43 localhost systemd[1]: Started LSB: Bring up/down networking.
Feb 20 02:15:44 localhost python3[22296]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 02:15:44 localhost sshd[22298]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:44 localhost systemd[1]: Starting Open vSwitch Database Unit...
Feb 20 02:15:44 localhost chown[22301]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 20 02:15:44 localhost ovs-ctl[22306]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 20 02:15:44 localhost ovs-ctl[22306]: Creating empty database /etc/openvswitch/conf.db [ OK ]
Feb 20 02:15:44 localhost ovs-ctl[22306]: Starting ovsdb-server [ OK ]
Feb 20 02:15:44 localhost ovs-vsctl[22356]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 20 02:15:44 localhost ovs-vsctl[22376]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"e6b84e4d-7dff-4c2c-96db-c41e3ef520c6\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Feb 20 02:15:44 localhost ovs-ctl[22306]: Configuring Open vSwitch system IDs [ OK ]
Feb 20 02:15:44 localhost ovs-vsctl[22382]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005625204.novalocal
Feb 20 02:15:44 localhost ovs-ctl[22306]: Enabling remote OVSDB managers [ OK ]
Feb 20 02:15:44 localhost systemd[1]: Started Open vSwitch Database Unit.
Feb 20 02:15:44 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 20 02:15:44 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 20 02:15:44 localhost systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 20 02:15:44 localhost kernel: openvswitch: Open vSwitch switching datapath
Feb 20 02:15:44 localhost ovs-ctl[22426]: Inserting openvswitch module [ OK ]
Feb 20 02:15:44 localhost ovs-ctl[22395]: Starting ovs-vswitchd [ OK ]
Feb 20 02:15:44 localhost ovs-vsctl[22445]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005625204.novalocal
Feb 20 02:15:44 localhost ovs-ctl[22395]: Enabling remote OVSDB managers [ OK ]
Feb 20 02:15:44 localhost systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 20 02:15:44 localhost systemd[1]: Starting Open vSwitch...
Feb 20 02:15:44 localhost systemd[1]: Finished Open vSwitch.
Feb 20 02:15:47 localhost sshd[22449]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:15:58 localhost sshd[22451]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:16:03 localhost sshd[22453]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:16:11 localhost sshd[22455]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:16:15 localhost python3[22472]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-00000000001a-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:16:16 localhost NetworkManager[5988]: [1771571776.3677] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22669 uid=0 result="success"
Feb 20 02:16:16 localhost ifup[22670]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:16 localhost ifup[22671]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:16 localhost ifup[22672]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:16 localhost NetworkManager[5988]: [1771571776.4002] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22678 uid=0 result="success"
Feb 20 02:16:16 localhost ovs-vsctl[22680]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:ba:18:b1 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Feb 20 02:16:16 localhost kernel: device ovs-system entered promiscuous mode
Feb 20 02:16:16 localhost NetworkManager[5988]: [1771571776.4262] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Feb 20 02:16:16 localhost systemd-udevd[22681]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 02:16:16 localhost kernel: Timeout policy base is empty
Feb 20 02:16:16 localhost kernel: Failed to associated timeout policy `ovs_test_tp'
Feb 20 02:16:16 localhost kernel: device br-ex entered promiscuous mode
Feb 20 02:16:16 localhost systemd-udevd[22696]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 02:16:16 localhost NetworkManager[5988]: [1771571776.4715] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Feb 20 02:16:16 localhost NetworkManager[5988]: [1771571776.5021] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22706 uid=0 result="success"
Feb 20 02:16:16 localhost NetworkManager[5988]: [1771571776.5230] device (br-ex): carrier: link connected
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.5795] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22735 uid=0 result="success"
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.6278] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22750 uid=0 result="success"
Feb 20 02:16:19 localhost NET[22775]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.7095] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.7148] dhcp4 (eth1): canceled DHCP transaction
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.7148] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.7149] dhcp4 (eth1): state changed no lease
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.7188] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22784 uid=0 result="success"
Feb 20 02:16:19 localhost ifup[22785]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:19 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 02:16:19 localhost ifup[22786]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:19 localhost ifup[22788]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:19 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.7550] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22801 uid=0 result="success"
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.8046] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22812 uid=0 result="success"
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.8139] device (eth1): carrier: link connected
Feb 20 02:16:19 localhost NetworkManager[5988]: [1771571779.8346] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22821 uid=0 result="success"
Feb 20 02:16:19 localhost ipv6_wait_tentative[22833]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 20 02:16:20 localhost ipv6_wait_tentative[22838]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 20 02:16:21 localhost NetworkManager[5988]: [1771571781.9060] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22847 uid=0 result="success"
Feb 20 02:16:21 localhost ovs-vsctl[22862]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Feb 20 02:16:21 localhost kernel: device eth1 entered promiscuous mode
Feb 20 02:16:22 localhost NetworkManager[5988]: [1771571782.0041] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22870 uid=0 result="success"
Feb 20 02:16:22 localhost ifup[22871]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:22 localhost ifup[22872]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:22 localhost ifup[22873]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:22 localhost NetworkManager[5988]: [1771571782.0353] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22879 uid=0 result="success"
Feb 20 02:16:22 localhost NetworkManager[5988]: [1771571782.0769] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22889 uid=0 result="success"
Feb 20 02:16:22 localhost ifup[22890]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:22 localhost ifup[22891]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:22 localhost ifup[22892]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:22 localhost NetworkManager[5988]: [1771571782.1071] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22898 uid=0 result="success"
Feb 20 02:16:22 localhost ovs-vsctl[22901]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 20 02:16:22 localhost kernel: device vlan21 entered promiscuous mode
Feb 20 02:16:22 localhost NetworkManager[5988]: [1771571782.1473] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Feb 20 02:16:22 localhost systemd-udevd[22903]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 02:16:22 localhost NetworkManager[5988]: [1771571782.1670] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22912 uid=0 result="success"
Feb 20 02:16:22 localhost NetworkManager[5988]: [1771571782.1863] device (vlan21): carrier: link connected
Feb 20 02:16:25 localhost NetworkManager[5988]: [1771571785.2467] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22941 uid=0 result="success"
Feb 20 02:16:25 localhost NetworkManager[5988]: [1771571785.2918] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22956 uid=0 result="success"
Feb 20 02:16:25 localhost NetworkManager[5988]: [1771571785.3499] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22977 uid=0 result="success"
Feb 20 02:16:25 localhost ifup[22978]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:25 localhost ifup[22979]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:25 localhost ifup[22980]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:25 localhost NetworkManager[5988]: [1771571785.3826] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22986 uid=0 result="success"
Feb 20 02:16:25 localhost ovs-vsctl[22989]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 20 02:16:25 localhost kernel: device vlan44 entered promiscuous mode
Feb 20 02:16:25 localhost systemd-udevd[22991]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 02:16:25 localhost NetworkManager[5988]: [1771571785.4231] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Feb 20 02:16:25 localhost NetworkManager[5988]: [1771571785.4491] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23001 uid=0 result="success"
Feb 20 02:16:25 localhost NetworkManager[5988]: [1771571785.4731] device (vlan44): carrier: link connected
Feb 20 02:16:27 localhost sshd[23022]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:16:28 localhost NetworkManager[5988]: [1771571788.5345] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23033 uid=0 result="success"
Feb 20 02:16:28 localhost NetworkManager[5988]: [1771571788.5802] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23048 uid=0 result="success"
Feb 20 02:16:28 localhost NetworkManager[5988]: [1771571788.6401] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23069 uid=0 result="success"
Feb 20 02:16:28 localhost ifup[23070]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:28 localhost ifup[23071]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:28 localhost ifup[23072]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:28 localhost NetworkManager[5988]: [1771571788.6719] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23078 uid=0 result="success"
Feb 20 02:16:28 localhost ovs-vsctl[23081]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 20 02:16:28 localhost kernel: device vlan20 entered promiscuous mode
Feb 20 02:16:28 localhost NetworkManager[5988]: [1771571788.7132] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Feb 20 02:16:28 localhost systemd-udevd[23083]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 02:16:28 localhost NetworkManager[5988]: [1771571788.7409] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23093 uid=0 result="success"
Feb 20 02:16:28 localhost NetworkManager[5988]: [1771571788.7619] device (vlan20): carrier: link connected
Feb 20 02:16:29 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 02:16:31 localhost NetworkManager[5988]: [1771571791.8210] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23123 uid=0 result="success"
Feb 20 02:16:31 localhost NetworkManager[5988]: [1771571791.8693] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23138 uid=0 result="success"
Feb 20 02:16:31 localhost NetworkManager[5988]: [1771571791.9266] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23159 uid=0 result="success"
Feb 20 02:16:31 localhost ifup[23160]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:31 localhost ifup[23161]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:31 localhost ifup[23162]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:31 localhost NetworkManager[5988]: [1771571791.9584] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23168 uid=0 result="success"
Feb 20 02:16:32 localhost ovs-vsctl[23171]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 20 02:16:32 localhost kernel: device vlan22 entered promiscuous mode
Feb 20 02:16:32 localhost systemd-udevd[23173]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 02:16:32 localhost NetworkManager[5988]: [1771571792.0446] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Feb 20 02:16:32 localhost NetworkManager[5988]: [1771571792.0705] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23183 uid=0 result="success"
Feb 20 02:16:32 localhost NetworkManager[5988]: [1771571792.0902] device (vlan22): carrier: link connected
Feb 20 02:16:35 localhost NetworkManager[5988]: [1771571795.1387] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23213 uid=0 result="success"
Feb 20 02:16:35 localhost NetworkManager[5988]: [1771571795.1840] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23228 uid=0 result="success"
Feb 20 02:16:35 localhost NetworkManager[5988]: [1771571795.2380] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23249 uid=0 result="success"
Feb 20 02:16:35 localhost ifup[23250]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:35 localhost ifup[23251]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:35 localhost ifup[23252]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:35 localhost NetworkManager[5988]: [1771571795.2632] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23258 uid=0 result="success"
Feb 20 02:16:35 localhost ovs-vsctl[23261]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 20 02:16:35 localhost systemd-udevd[23263]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 02:16:35 localhost kernel: device vlan23 entered promiscuous mode
Feb 20 02:16:35 localhost NetworkManager[5988]: [1771571795.3027] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Feb 20 02:16:35 localhost NetworkManager[5988]: [1771571795.3273] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23273 uid=0 result="success"
Feb 20 02:16:35 localhost NetworkManager[5988]: [1771571795.3477] device (vlan23): carrier: link connected
Feb 20 02:16:38 localhost NetworkManager[5988]: [1771571798.4005] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23303 uid=0 result="success"
Feb 20 02:16:38 localhost NetworkManager[5988]: [1771571798.4453] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23318 uid=0 result="success"
Feb 20 02:16:38 localhost NetworkManager[5988]: [1771571798.5049] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23339 uid=0 result="success"
Feb 20 02:16:38 localhost ifup[23340]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:38 localhost ifup[23341]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:38 localhost ifup[23342]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:38 localhost NetworkManager[5988]: [1771571798.5355] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23348 uid=0 result="success"
Feb 20 02:16:38 localhost ovs-vsctl[23351]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 20 02:16:38 localhost NetworkManager[5988]: [1771571798.5915] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23358 uid=0 result="success"
Feb 20 02:16:39 localhost NetworkManager[5988]: [1771571799.6741] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23385 uid=0 result="success"
Feb 20 02:16:39 localhost NetworkManager[5988]: [1771571799.7196] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23400 uid=0 result="success"
Feb 20 02:16:39 localhost NetworkManager[5988]: [1771571799.7682] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23421 uid=0 result="success"
Feb 20 02:16:39 localhost ifup[23422]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:39 localhost ifup[23423]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:39 localhost ifup[23424]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:39 localhost NetworkManager[5988]: [1771571799.7965] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23430 uid=0 result="success"
Feb 20 02:16:39 localhost ovs-vsctl[23433]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 20 02:16:39 localhost NetworkManager[5988]: [1771571799.8493] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23440 uid=0 result="success"
Feb 20 02:16:40 localhost NetworkManager[5988]: [1771571800.9080] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23468 uid=0 result="success"
Feb 20 02:16:40 localhost NetworkManager[5988]: [1771571800.9504] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23483 uid=0 result="success"
Feb 20 02:16:41 localhost NetworkManager[5988]: [1771571801.0095] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23504 uid=0 result="success"
Feb 20 02:16:41 localhost ifup[23505]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 02:16:41 localhost ifup[23506]: 'network-scripts' will be removed from distribution in near future.
Feb 20 02:16:41 localhost ifup[23507]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 02:16:41 localhost NetworkManager[5988]: [1771571801.0408] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23513 uid=0 result="success" Feb 20 02:16:41 localhost ovs-vsctl[23516]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Feb 20 02:16:41 localhost NetworkManager[5988]: [1771571801.0989] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23523 uid=0 result="success" Feb 20 02:16:41 localhost sshd[23542]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:16:42 localhost NetworkManager[5988]: [1771571802.1596] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23552 uid=0 result="success" Feb 20 02:16:42 localhost NetworkManager[5988]: [1771571802.2088] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23568 uid=0 result="success" Feb 20 02:16:42 localhost NetworkManager[5988]: [1771571802.2703] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23589 uid=0 result="success" Feb 20 02:16:42 localhost ifup[23590]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 20 02:16:42 localhost ifup[23591]: 'network-scripts' will be removed from distribution in near future. Feb 20 02:16:42 localhost ifup[23592]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Feb 20 02:16:42 localhost NetworkManager[5988]: [1771571802.3019] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23598 uid=0 result="success" Feb 20 02:16:42 localhost ovs-vsctl[23601]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal Feb 20 02:16:42 localhost NetworkManager[5988]: [1771571802.3586] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23608 uid=0 result="success" Feb 20 02:16:43 localhost NetworkManager[5988]: [1771571803.4158] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23636 uid=0 result="success" Feb 20 02:16:43 localhost NetworkManager[5988]: [1771571803.4631] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23651 uid=0 result="success" Feb 20 02:16:43 localhost NetworkManager[5988]: [1771571803.5242] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23672 uid=0 result="success" Feb 20 02:16:43 localhost ifup[23673]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 20 02:16:43 localhost ifup[23674]: 'network-scripts' will be removed from distribution in near future. Feb 20 02:16:43 localhost ifup[23675]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
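The per-VLAN ovs-vsctl invocations recorded above all follow a single template (delete the port if present, re-add it with the VLAN tag, mark the interface internal). A minimal sketch that reconstructs that command string for a given bridge/port/tag; the function only builds the string, so Open vSwitch is not needed to run it, and the `build_vlan_port_cmd` name is my own:

```shell
# Reconstruct the ovs-vsctl command template seen in the log.
# This only prints the command; it does not talk to ovsdb-server.
build_vlan_port_cmd() {
    bridge="$1"; port="$2"; tag="$3"
    printf 'ovs-vsctl -t 10 -- --if-exists del-port %s %s -- add-port %s %s tag=%s -- set Interface %s type=internal\n' \
        "$bridge" "$port" "$bridge" "$port" "$tag" "$port"
}

build_vlan_port_cmd br-ex vlan44 44
```

Substituting `vlan20`/`20` through `vlan23`/`23` reproduces each of the logged `Called as ovs-vsctl ...` lines.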
Feb 20 02:16:43 localhost NetworkManager[5988]: [1771571803.5554] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23681 uid=0 result="success"
Feb 20 02:16:43 localhost ovs-vsctl[23684]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 20 02:16:43 localhost NetworkManager[5988]: [1771571803.6123] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23691 uid=0 result="success"
Feb 20 02:16:44 localhost NetworkManager[5988]: [1771571804.7144] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23719 uid=0 result="success"
Feb 20 02:16:44 localhost NetworkManager[5988]: [1771571804.7610] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23734 uid=0 result="success"
Feb 20 02:16:50 localhost sshd[23752]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:16:58 localhost sshd[23754]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:09 localhost sshd[23757]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:09 localhost sshd[23759]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:10 localhost python3[23774]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-00000000001b-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:17:15 localhost python3[23793]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 02:17:15 localhost python3[23809]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 02:17:16 localhost python3[23823]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 02:17:17 localhost python3[23839]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 02:17:17 localhost python3[23853]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Feb 20 02:17:18 localhost python3[23868]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005625204.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-000000000022-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:17:19 localhost python3[23888]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.localdomain"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:17:19 localhost systemd[1]: Starting Hostname Service...
Feb 20 02:17:19 localhost systemd[1]: Started Hostname Service.
Feb 20 02:17:19 localhost systemd-hostnamed[23892]: Hostname set to (static)
Feb 20 02:17:19 localhost NetworkManager[5988]: [1771571839.7481] hostname: static hostname changed from "np0005625204.novalocal" to "np0005625204.localdomain"
Feb 20 02:17:19 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 02:17:19 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 02:17:20 localhost systemd[1]: session-11.scope: Deactivated successfully.
Feb 20 02:17:20 localhost systemd[1]: session-11.scope: Consumed 1min 44.180s CPU time.
Feb 20 02:17:20 localhost systemd-logind[759]: Session 11 logged out. Waiting for processes to exit.
Feb 20 02:17:20 localhost systemd-logind[759]: Removed session 11.
Feb 20 02:17:21 localhost sshd[23903]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:23 localhost sshd[23905]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:24 localhost systemd-logind[759]: New session 12 of user zuul.
Feb 20 02:17:24 localhost systemd[1]: Started Session 12 of User zuul.
Feb 20 02:17:24 localhost python3[23922]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Feb 20 02:17:25 localhost systemd[1]: session-12.scope: Deactivated successfully.
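The ansible task logged at 02:17:18 shortens the FQDN to its first label before writing it to /home/zuul/ansible_hostname. A sketch of that step (the value "np0005625204.novalocal" is taken from the log; the log's own idiom is a bash array, shown in the comment, while the executed line below uses the equivalent portable parameter expansion):

```shell
# Log's bash-array idiom:
#   hostname_str_array=(${hostname//./ }); echo ${hostname_str_array[0]}
# Equivalent POSIX form: strip everything from the first "." onward.
hostname="np0005625204.novalocal"
short_hostname="${hostname%%.*}"
echo "$short_hostname"   # prints np0005625204
```

The hostnamectl task at 02:17:19 then reads this short name back and appends ".localdomain", which matches the NetworkManager line reporting the change to "np0005625204.localdomain".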
Feb 20 02:17:25 localhost systemd-logind[759]: Session 12 logged out. Waiting for processes to exit.
Feb 20 02:17:25 localhost systemd-logind[759]: Removed session 12.
Feb 20 02:17:29 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 02:17:31 localhost sshd[23923]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:37 localhost sshd[23925]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:38 localhost sshd[23927]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:46 localhost sshd[23929]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:17:49 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 20 02:17:53 localhost sshd[23934]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:03 localhost sshd[23936]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:10 localhost sshd[23938]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:15 localhost sshd[23940]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:16 localhost systemd-logind[759]: New session 13 of user zuul.
Feb 20 02:18:16 localhost systemd[1]: Started Session 13 of User zuul.
Feb 20 02:18:16 localhost python3[23959]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 02:18:19 localhost sshd[23974]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:19 localhost systemd[1]: Reloading.
Feb 20 02:18:20 localhost systemd-rc-local-generator[23999]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 02:18:20 localhost systemd-sysv-generator[24002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 02:18:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 02:18:20 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 20 02:18:20 localhost systemd[1]: Reloading.
Feb 20 02:18:20 localhost systemd-rc-local-generator[24040]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 02:18:20 localhost systemd-sysv-generator[24043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 02:18:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 02:18:20 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 20 02:18:20 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 20 02:18:20 localhost systemd[1]: Reloading.
Feb 20 02:18:20 localhost systemd-rc-local-generator[24084]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 02:18:20 localhost systemd-sysv-generator[24087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 02:18:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 02:18:20 localhost systemd[1]: Starting dnf makecache...
Feb 20 02:18:20 localhost systemd[1]: Listening on LVM2 poll daemon socket.
Feb 20 02:18:21 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 02:18:21 localhost dnf[24095]: Updating Subscription Management repositories.
Feb 20 02:18:21 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 20 02:18:21 localhost systemd[1]: Reloading.
Feb 20 02:18:21 localhost systemd-rc-local-generator[24134]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 02:18:21 localhost systemd-sysv-generator[24141]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 02:18:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 02:18:21 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 02:18:21 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 02:18:21 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 02:18:21 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 20 02:18:21 localhost systemd[1]: run-r187169f6d8f44215b922a68d2dedb85d.service: Deactivated successfully.
Feb 20 02:18:21 localhost systemd[1]: run-r1c752a7949614ed7b4fc52d4bf634731.service: Deactivated successfully.
Feb 20 02:18:22 localhost dnf[24095]: Failed determining last makecache time.
Feb 20 02:18:23 localhost dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 26 kB/s | 4.1 kB 00:00
Feb 20 02:18:23 localhost dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - High Av 51 kB/s | 4.0 kB 00:00
Feb 20 02:18:23 localhost dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 51 kB/s | 4.5 kB 00:00
Feb 20 02:18:23 localhost dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 48 kB/s | 4.1 kB 00:00
Feb 20 02:18:23 localhost dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 51 kB/s | 4.5 kB 00:00
Feb 20 02:18:23 localhost dnf[24095]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_ 48 kB/s | 4.0 kB 00:00
Feb 20 02:18:24 localhost dnf[24095]: Fast Datapath for RHEL 9 x86_64 (RPMs) 47 kB/s | 4.0 kB 00:00
Feb 20 02:18:24 localhost dnf[24095]: Metadata cache created.
Feb 20 02:18:24 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 20 02:18:24 localhost systemd[1]: Finished dnf makecache.
Feb 20 02:18:24 localhost systemd[1]: dnf-makecache.service: Consumed 2.889s CPU time.
Feb 20 02:18:25 localhost sshd[24743]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:26 localhost sshd[24745]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:32 localhost sshd[24747]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:38 localhost sshd[24749]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:43 localhost sshd[24751]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:50 localhost sshd[24753]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:18:57 localhost sshd[24755]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:19:06 localhost sshd[24757]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:19:11 localhost sshd[24759]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:19:17 localhost sshd[24761]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:19:22 localhost systemd[1]: session-13.scope: Deactivated successfully.
Feb 20 02:19:22 localhost systemd[1]: session-13.scope: Consumed 4.735s CPU time.
Feb 20 02:19:22 localhost systemd-logind[759]: Session 13 logged out. Waiting for processes to exit.
Feb 20 02:19:22 localhost systemd-logind[759]: Removed session 13.
Feb 20 02:19:29 localhost sshd[24763]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:19:39 localhost sshd[24765]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:19:43 localhost sshd[24767]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:19:49 localhost sshd[24769]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:19:57 localhost sshd[24771]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:20:08 localhost sshd[24774]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:20:15 localhost sshd[24776]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:20:27 localhost sshd[24778]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:20:36 localhost sshd[24780]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:20:40 localhost sshd[24782]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:20:45 localhost sshd[24785]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:20:54 localhost sshd[24787]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:20:58 localhost sshd[24789]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:21:00 localhost sshd[24791]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:21:05 localhost sshd[24793]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:21:13 localhost sshd[24795]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:21:23 localhost sshd[24797]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:21:28 localhost sshd[24799]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:21:33 localhost sshd[24801]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:21:36 localhost sshd[24803]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:21:44 localhost sshd[24805]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:01 localhost sshd[24807]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:09 localhost sshd[24809]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:20 localhost sshd[24811]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:24 localhost sshd[24813]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:26 localhost sshd[24815]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:34 localhost sshd[24817]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:38 localhost sshd[24819]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:43 localhost sshd[24821]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:51 localhost sshd[24823]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:22:55 localhost sshd[24825]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:23:01 localhost sshd[24827]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:23:09 localhost sshd[24829]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:23:16 localhost sshd[24831]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:23:24 localhost sshd[24833]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:23:32 localhost sshd[24835]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:23:43 localhost sshd[24837]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:23:44 localhost sshd[24839]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:23:56 localhost sshd[24841]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:24:07 localhost sshd[24843]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:24:10 localhost sshd[24845]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:24:12 localhost sshd[24847]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:24:25 localhost sshd[24849]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:24:36 localhost sshd[24851]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:24:41 localhost sshd[24853]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:24:47 localhost sshd[24855]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:24:56 localhost sshd[24857]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:25:03 localhost sshd[24859]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:25:11 localhost sshd[24861]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:25:18 localhost sshd[24863]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:25:27 localhost sshd[24865]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:25:29 localhost sshd[24866]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:25:44 localhost sshd[24870]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:25:53 localhost sshd[24872]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:25:58 localhost sshd[24874]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:26:10 localhost sshd[24876]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:26:16 localhost sshd[24878]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:26:21 localhost sshd[24880]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:26:24 localhost sshd[24882]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:26:31 localhost sshd[24884]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:26:43 localhost sshd[24886]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:26:48 localhost sshd[24888]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:26:55 localhost sshd[24890]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:02 localhost sshd[24892]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:06 localhost sshd[24894]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:13 localhost sshd[24896]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:17 localhost sshd[24898]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:22 localhost sshd[24900]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:31 localhost sshd[24902]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:35 localhost sshd[24904]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:37 localhost sshd[24906]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:42 localhost sshd[24908]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:49 localhost sshd[24910]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:27:55 localhost sshd[24912]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:28:03 localhost sshd[24914]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:28:09 localhost sshd[24916]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:28:18 localhost sshd[24918]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:28:23 localhost sshd[24920]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:28:28 localhost sshd[24922]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:28:35 localhost sshd[24924]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:28:45 localhost sshd[24926]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:28:57 localhost sshd[24928]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:06 localhost sshd[24930]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:08 localhost sshd[24932]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:13 localhost sshd[24934]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:13 localhost sshd[24936]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:19 localhost sshd[24938]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:28 localhost sshd[24940]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:36 localhost sshd[24942]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:44 localhost sshd[24944]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:29:53 localhost sshd[24946]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:00 localhost sshd[24948]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:06 localhost sshd[24950]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:07 localhost sshd[24951]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:11 localhost sshd[24953]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:17 localhost sshd[24955]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:23 localhost sshd[24957]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:29 localhost sshd[24959]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:34 localhost sshd[24961]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:41 localhost sshd[24963]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:30:52 localhost sshd[24965]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:31:02 localhost sshd[24968]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:31:07 localhost sshd[24970]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:31:17 localhost sshd[24972]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:31:20 localhost sshd[24974]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:31:30 localhost sshd[24976]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:31:36 localhost sshd[24978]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:31:48 localhost sshd[24980]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:31:54 localhost sshd[24982]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:01 localhost sshd[24984]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:03 localhost sshd[24986]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:08 localhost sshd[24988]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:17 localhost sshd[24990]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:24 localhost sshd[24992]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:30 localhost sshd[24994]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:40 localhost sshd[24996]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:47 localhost sshd[24998]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:56 localhost sshd[25000]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:32:57 localhost sshd[25001]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:33:03 localhost sshd[25004]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:33:12 localhost sshd[25006]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:33:20 localhost sshd[25008]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:33:29 localhost sshd[25010]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:33:34 localhost sshd[25012]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:33:42 localhost sshd[25014]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:33:49 localhost sshd[25016]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:33:57 localhost sshd[25018]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:06 localhost sshd[25020]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:14 localhost sshd[25022]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:23 localhost sshd[25024]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:30 localhost sshd[25026]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:38 localhost sshd[25028]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:47 localhost sshd[25030]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:52 localhost sshd[25034]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:54 localhost sshd[25036]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:34:55 localhost sshd[25038]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:35:02 localhost sshd[25042]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:35:03 localhost systemd-logind[759]: New session 14 of user zuul.
Feb 20 02:35:03 localhost systemd[1]: Started Session 14 of User zuul.
Feb 20 02:35:03 localhost python3[25090]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 02:35:05 localhost python3[25177]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 02:35:08 localhost python3[25194]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 02:35:09 localhost python3[25210]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:35:09 localhost kernel: loop: module loaded
Feb 20 02:35:09 localhost kernel: loop3: detected capacity change from 0 to 14680064
Feb 20 02:35:09 localhost sshd[25220]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:35:09 localhost python3[25236]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:35:09 localhost lvm[25239]: PV /dev/loop3 not used.
Feb 20 02:35:10 localhost lvm[25248]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 20 02:35:10 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Feb 20 02:35:10 localhost lvm[25250]: 1 logical volume(s) in volume group "ceph_vg0" now active
Feb 20 02:35:10 localhost systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Feb 20 02:35:10 localhost python3[25298]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:35:11 localhost python3[25342]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771572910.4004626-54744-197095392528280/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:35:11 localhost python3[25372]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 02:35:11 localhost systemd[1]: Reloading.
Feb 20 02:35:12 localhost systemd-rc-local-generator[25396]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 02:35:12 localhost systemd-sysv-generator[25402]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 02:35:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:35:12 localhost systemd[1]: Starting Ceph OSD losetup... Feb 20 02:35:12 localhost bash[25412]: /dev/loop3: [64516]:9169619 (/var/lib/ceph-osd-0.img) Feb 20 02:35:12 localhost systemd[1]: Finished Ceph OSD losetup. Feb 20 02:35:12 localhost lvm[25413]: PV /dev/loop3 online, VG ceph_vg0 is complete. Feb 20 02:35:12 localhost lvm[25413]: VG ceph_vg0 finished Feb 20 02:35:12 localhost python3[25430]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 20 02:35:15 localhost python3[25447]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:35:16 localhost python3[25463]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:35:16 localhost kernel: loop4: detected capacity change from 0 to 14680064 Feb 20 02:35:17 localhost python3[25485]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n 
ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:35:17 localhost lvm[25488]: PV /dev/loop4 not used. Feb 20 02:35:17 localhost lvm[25498]: PV /dev/loop4 online, VG ceph_vg1 is complete. Feb 20 02:35:17 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1. Feb 20 02:35:17 localhost lvm[25500]: 1 logical volume(s) in volume group "ceph_vg1" now active Feb 20 02:35:17 localhost systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully. Feb 20 02:35:17 localhost python3[25548]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:35:18 localhost python3[25591]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771572917.5701373-54975-86165242145165/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:35:18 localhost sshd[25621]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:35:18 localhost python3[25622]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:35:18 localhost systemd[1]: Reloading. Feb 20 02:35:19 localhost systemd-rc-local-generator[25646]: /etc/rc.d/rc.local is not marked executable, skipping. 
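Editor's note: the `dd` invocations above (`bs=1 count=0 seek=7G`) copy no data at all; `dd` simply extends the output file to 7 GiB, producing a sparse backing image for the loop device. An unprivileged demonstration of the idiom, assuming Linux with GNU coreutils (the temp file is illustrative):

```shell
# "count=0 seek=7G" writes nothing; dd just extends the file to 7 GiB,
# so the image is sparse and consumes almost no real disk space.
img=$(mktemp)
dd if=/dev/zero of="$img" bs=1 count=0 seek=7G 2>/dev/null
stat -c 'apparent size: %s bytes' "$img"   # 7 GiB = 7516192768 bytes
stat -c 'allocated 512B blocks: %b' "$img" # near zero for a sparse file
rm -f "$img"
```

The real images at /var/lib/ceph-osd-*.img therefore cost little space until the OSDs write data, matching the `loop3`/`loop4` "detected capacity change from 0 to 14680064" (sectors) messages.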
Feb 20 02:35:19 localhost systemd-sysv-generator[25649]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:35:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:35:19 localhost systemd[1]: Starting Ceph OSD losetup... Feb 20 02:35:19 localhost bash[25664]: /dev/loop4: [64516]:9171554 (/var/lib/ceph-osd-1.img) Feb 20 02:35:19 localhost systemd[1]: Finished Ceph OSD losetup. Feb 20 02:35:19 localhost lvm[25665]: PV /dev/loop4 online, VG ceph_vg1 is complete. Feb 20 02:35:19 localhost lvm[25665]: VG ceph_vg1 finished Feb 20 02:35:28 localhost python3[25711]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Feb 20 02:35:29 localhost python3[25731]: ansible-hostname Invoked with name=np0005625204.localdomain use=None Feb 20 02:35:29 localhost systemd[1]: Starting Hostname Service... Feb 20 02:35:29 localhost systemd[1]: Started Hostname Service. Feb 20 02:35:31 localhost python3[25754]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. 
path=None Feb 20 02:35:32 localhost python3[25802]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.2_tj08awtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:35:32 localhost python3[25832]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.2_tj08awtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:35:33 localhost python3[25848]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.2_tj08awtmphosts insertbefore=BOF block=192.168.122.106 np0005625202.localdomain np0005625202#012192.168.122.106 np0005625202.ctlplane.localdomain np0005625202.ctlplane#012192.168.122.107 np0005625203.localdomain np0005625203#012192.168.122.107 np0005625203.ctlplane.localdomain np0005625203.ctlplane#012192.168.122.108 np0005625204.localdomain np0005625204#012192.168.122.108 np0005625204.ctlplane.localdomain np0005625204.ctlplane#012192.168.122.103 np0005625199.localdomain np0005625199#012192.168.122.103 np0005625199.ctlplane.localdomain np0005625199.ctlplane#012192.168.122.104 np0005625200.localdomain np0005625200#012192.168.122.104 np0005625200.ctlplane.localdomain np0005625200.ctlplane#012192.168.122.105 np0005625201.localdomain np0005625201#012192.168.122.105 np0005625201.ctlplane.localdomain np0005625201.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: 
overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:35:33 localhost sshd[25857]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:35:33 localhost python3[25865]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.2_tj08awtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:35:34 localhost python3[25882]: ansible-file Invoked with path=/tmp/ansible.2_tj08awtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:35:36 localhost python3[25899]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:35:37 localhost python3[25917]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 20 02:35:39 localhost sshd[25919]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:35:41 localhost 
python3[25968]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:35:41 localhost sshd[25998]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:35:41 localhost python3[26014]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771572940.6635659-55771-249721785478797/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:35:42 localhost python3[26045]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:35:43 localhost python3[26063]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 02:35:43 localhost chronyd[765]: chronyd exiting Feb 20 02:35:43 localhost systemd[1]: Stopping NTP client/server... Feb 20 02:35:43 localhost systemd[1]: chronyd.service: Deactivated successfully. Feb 20 02:35:43 localhost systemd[1]: Stopped NTP client/server. Feb 20 02:35:43 localhost systemd[1]: chronyd.service: Consumed 113ms CPU time, read 1.9M from disk, written 0B to disk. Feb 20 02:35:43 localhost systemd[1]: Starting NTP client/server... 
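Editor's note: decoded (`#012` → newline), the `ansible-blockinfile` payload logged above at 02:35:33 corresponds to this /etc/hosts fragment, with the `# {mark}` markers rendered:

```
# START_HOST_ENTRIES_FOR_STACK: overcloud
192.168.122.106 np0005625202.localdomain np0005625202
192.168.122.106 np0005625202.ctlplane.localdomain np0005625202.ctlplane
192.168.122.107 np0005625203.localdomain np0005625203
192.168.122.107 np0005625203.ctlplane.localdomain np0005625203.ctlplane
192.168.122.108 np0005625204.localdomain np0005625204
192.168.122.108 np0005625204.ctlplane.localdomain np0005625204.ctlplane
192.168.122.103 np0005625199.localdomain np0005625199
192.168.122.103 np0005625199.ctlplane.localdomain np0005625199.ctlplane
192.168.122.104 np0005625200.localdomain np0005625200
192.168.122.104 np0005625200.ctlplane.localdomain np0005625200.ctlplane
192.168.122.105 np0005625201.localdomain np0005625201
192.168.122.105 np0005625201.ctlplane.localdomain np0005625201.ctlplane

192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
# END_HOST_ENTRIES_FOR_STACK: overcloud
```

The play edits a tempfile copy (/tmp/ansible.2_tj08awtmphosts), copies it back over /etc/hosts, then deletes the tempfile, as the surrounding entries show.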
Feb 20 02:35:43 localhost chronyd[26071]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Feb 20 02:35:43 localhost chronyd[26071]: Frequency -29.979 +/- 0.262 ppm read from /var/lib/chrony/drift Feb 20 02:35:43 localhost chronyd[26071]: Loaded seccomp filter (level 2) Feb 20 02:35:43 localhost systemd[1]: Started NTP client/server. Feb 20 02:35:44 localhost python3[26120]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:35:44 localhost python3[26163]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771572943.9772875-56003-222424667383480/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:35:45 localhost python3[26193]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:35:45 localhost systemd[1]: Reloading. Feb 20 02:35:45 localhost systemd-rc-local-generator[26214]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:35:45 localhost systemd-sysv-generator[26219]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 02:35:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:35:45 localhost systemd[1]: Reloading. Feb 20 02:35:45 localhost systemd-rc-local-generator[26256]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:35:45 localhost systemd-sysv-generator[26261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:35:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:35:45 localhost systemd[1]: Starting chronyd online sources service... Feb 20 02:35:45 localhost chronyc[26269]: 200 OK Feb 20 02:35:45 localhost systemd[1]: chrony-online.service: Deactivated successfully. Feb 20 02:35:45 localhost systemd[1]: Finished chronyd online sources service. 
Feb 20 02:35:46 localhost python3[26285]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:35:46 localhost chronyd[26071]: System clock was stepped by 0.000000 seconds Feb 20 02:35:46 localhost python3[26302]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:35:48 localhost chronyd[26071]: Selected source 23.133.168.245 (pool.ntp.org) Feb 20 02:35:49 localhost sshd[26304]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:35:54 localhost sshd[26306]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:35:57 localhost python3[26323]: ansible-timezone Invoked with name=UTC hwclock=None Feb 20 02:35:57 localhost systemd[1]: Starting Time & Date Service... Feb 20 02:35:57 localhost systemd[1]: Started Time & Date Service. Feb 20 02:35:58 localhost python3[26343]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 02:35:58 localhost chronyd[26071]: chronyd exiting Feb 20 02:35:58 localhost systemd[1]: Stopping NTP client/server... Feb 20 02:35:58 localhost systemd[1]: chronyd.service: Deactivated successfully. Feb 20 02:35:58 localhost systemd[1]: Stopped NTP client/server. Feb 20 02:35:58 localhost systemd[1]: Starting NTP client/server... 
Feb 20 02:35:58 localhost chronyd[26351]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Feb 20 02:35:58 localhost chronyd[26351]: Frequency -29.979 +/- 0.262 ppm read from /var/lib/chrony/drift Feb 20 02:35:58 localhost chronyd[26351]: Loaded seccomp filter (level 2) Feb 20 02:35:58 localhost systemd[1]: Started NTP client/server. Feb 20 02:35:59 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully. Feb 20 02:36:01 localhost sshd[26356]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:36:02 localhost chronyd[26351]: Selected source 216.232.132.102 (pool.ntp.org) Feb 20 02:36:13 localhost sshd[26358]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:36:19 localhost sshd[26553]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:36:25 localhost sshd[26555]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:36:27 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
Feb 20 02:36:35 localhost sshd[26559]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:36:42 localhost sshd[26561]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:36:51 localhost sshd[26563]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:36:52 localhost sshd[26565]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:04 localhost sshd[26567]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:15 localhost sshd[26569]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:22 localhost sshd[26571]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:31 localhost sshd[26573]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:34 localhost sshd[26575]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:38 localhost sshd[26577]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:44 localhost sshd[26579]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:50 localhost sshd[26581]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:37:56 localhost sshd[26583]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:02 localhost sshd[26585]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:06 localhost sshd[26587]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:06 localhost systemd-logind[759]: New session 15 of user ceph-admin. Feb 20 02:38:06 localhost systemd[1]: Created slice User Slice of UID 1002. Feb 20 02:38:06 localhost systemd[1]: Starting User Runtime Directory /run/user/1002... Feb 20 02:38:06 localhost systemd[1]: Finished User Runtime Directory /run/user/1002. Feb 20 02:38:06 localhost systemd[1]: Starting User Manager for UID 1002... Feb 20 02:38:06 localhost sshd[26591]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:06 localhost sshd[26606]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:06 localhost systemd[26592]: Queued start job for default target Main User Target. Feb 20 02:38:06 localhost systemd[26592]: Created slice User Application Slice. 
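Editor's note: the long run of near-identical `sshd ... ssh-rsa algorithm is disabled` entries above (one per inbound connection) is easiest to read by counting rather than scanning. A generic sketch; the sample lines and temp file are illustrative, not from a real extract:

```shell
# Count repeated messages in a log extract. The sample lines mimic the
# sshd entries above; in practice point grep at the saved journal file.
log=$(mktemp)
cat > "$log" <<'EOF'
Feb 20 02:36:35 localhost sshd[26559]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:36:42 localhost sshd[26561]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:36:51 localhost sshd[26563]: main: sshd: ssh-rsa algorithm is disabled
EOF
grep -c 'ssh-rsa algorithm is disabled' "$log"   # prints 3
rm -f "$log"
```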
Feb 20 02:38:06 localhost systemd[26592]: Started Mark boot as successful after the user session has run 2 minutes. Feb 20 02:38:06 localhost systemd[26592]: Started Daily Cleanup of User's Temporary Directories. Feb 20 02:38:06 localhost systemd[26592]: Reached target Paths. Feb 20 02:38:06 localhost systemd[26592]: Reached target Timers. Feb 20 02:38:06 localhost systemd[26592]: Starting D-Bus User Message Bus Socket... Feb 20 02:38:06 localhost systemd[26592]: Starting Create User's Volatile Files and Directories... Feb 20 02:38:06 localhost systemd[26592]: Listening on D-Bus User Message Bus Socket. Feb 20 02:38:06 localhost systemd[26592]: Reached target Sockets. Feb 20 02:38:06 localhost systemd[26592]: Finished Create User's Volatile Files and Directories. Feb 20 02:38:06 localhost systemd[26592]: Reached target Basic System. Feb 20 02:38:06 localhost systemd[26592]: Reached target Main User Target. Feb 20 02:38:06 localhost systemd[26592]: Startup finished in 115ms. Feb 20 02:38:06 localhost systemd[1]: Started User Manager for UID 1002. Feb 20 02:38:06 localhost systemd[1]: Started Session 15 of User ceph-admin. Feb 20 02:38:06 localhost systemd-logind[759]: New session 17 of user ceph-admin. Feb 20 02:38:06 localhost systemd[1]: Started Session 17 of User ceph-admin. Feb 20 02:38:07 localhost sshd[26628]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:07 localhost systemd-logind[759]: New session 18 of user ceph-admin. Feb 20 02:38:07 localhost systemd[1]: Started Session 18 of User ceph-admin. Feb 20 02:38:07 localhost sshd[26647]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:07 localhost systemd-logind[759]: New session 19 of user ceph-admin. Feb 20 02:38:07 localhost systemd[1]: Started Session 19 of User ceph-admin. Feb 20 02:38:07 localhost sshd[26667]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:07 localhost systemd-logind[759]: New session 20 of user ceph-admin. 
Feb 20 02:38:07 localhost systemd[1]: Started Session 20 of User ceph-admin. Feb 20 02:38:08 localhost sshd[26686]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:08 localhost systemd-logind[759]: New session 21 of user ceph-admin. Feb 20 02:38:08 localhost systemd[1]: Started Session 21 of User ceph-admin. Feb 20 02:38:08 localhost sshd[26705]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:08 localhost systemd-logind[759]: New session 22 of user ceph-admin. Feb 20 02:38:08 localhost systemd[1]: Started Session 22 of User ceph-admin. Feb 20 02:38:09 localhost sshd[26724]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:09 localhost systemd-logind[759]: New session 23 of user ceph-admin. Feb 20 02:38:09 localhost systemd[1]: Started Session 23 of User ceph-admin. Feb 20 02:38:09 localhost sshd[26743]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:09 localhost systemd-logind[759]: New session 24 of user ceph-admin. Feb 20 02:38:09 localhost systemd[1]: Started Session 24 of User ceph-admin. Feb 20 02:38:09 localhost sshd[26762]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:09 localhost systemd-logind[759]: New session 25 of user ceph-admin. Feb 20 02:38:09 localhost systemd[1]: Started Session 25 of User ceph-admin. Feb 20 02:38:10 localhost sshd[26779]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:10 localhost systemd-logind[759]: New session 26 of user ceph-admin. Feb 20 02:38:10 localhost systemd[1]: Started Session 26 of User ceph-admin. Feb 20 02:38:10 localhost sshd[26798]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:10 localhost systemd-logind[759]: New session 27 of user ceph-admin. Feb 20 02:38:10 localhost systemd[1]: Started Session 27 of User ceph-admin. Feb 20 02:38:11 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Feb 20 02:38:14 localhost sshd[26837]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:23 localhost sshd[26839]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:25 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:38:25 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:38:26 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:38:26 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:38:26 localhost systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 27016 (sysctl) Feb 20 02:38:26 localhost systemd[1]: Mounting Arbitrary Executable File Formats File System... Feb 20 02:38:26 localhost systemd[1]: Mounted Arbitrary Executable File Formats File System. Feb 20 02:38:27 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:38:27 localhost sshd[27087]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:27 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:38:31 localhost kernel: VFS: idmapped mount is not enabled. 
Feb 20 02:38:36 localhost sshd[27268]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:36 localhost sshd[27270]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:42 localhost sshd[27285]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:51 localhost podman[27156]: Feb 20 02:38:51 localhost podman[27156]: 2026-02-20 07:38:27.873420556 +0000 UTC m=+0.038013065 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:38:51 localhost podman[27156]: 2026-02-20 07:38:51.450891786 +0000 UTC m=+23.615484275 container create 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.42.2, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64) Feb 20 02:38:51 localhost systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck3672036252-merged.mount: Deactivated successfully. 
Feb 20 02:38:51 localhost systemd[1]: Created slice Slice /machine. Feb 20 02:38:51 localhost systemd[1]: Started libpod-conmon-57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab.scope. Feb 20 02:38:51 localhost systemd[1]: Started libcrun container. Feb 20 02:38:51 localhost podman[27156]: 2026-02-20 07:38:51.560986311 +0000 UTC m=+23.725578830 container init 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git) Feb 20 02:38:51 localhost podman[27156]: 2026-02-20 07:38:51.575422045 +0000 UTC m=+23.740014554 container start 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, ceph=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, 
distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.42.2) Feb 20 02:38:51 localhost podman[27156]: 2026-02-20 07:38:51.575821116 +0000 UTC m=+23.740413665 container attach 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_CLEAN=True, RELEASE=main, 
url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, io.buildah.version=1.42.2, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7) Feb 20 02:38:51 localhost blissful_curran[27301]: 167 167 Feb 20 02:38:51 localhost systemd[1]: libpod-57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab.scope: Deactivated successfully. Feb 20 02:38:51 localhost podman[27156]: 2026-02-20 07:38:51.579424697 +0000 UTC m=+23.744017216 container died 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux , ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, name=rhceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Feb 20 02:38:51 localhost podman[27307]: 2026-02-20 07:38:51.681165198 +0000 UTC m=+0.086372631 container remove 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, ceph=True, name=rhceph, maintainer=Guillaume Abrioux , architecture=x86_64, RELEASE=main, GIT_CLEAN=True, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 02:38:51 localhost systemd[1]: libpod-conmon-57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab.scope: Deactivated successfully. 
Feb 20 02:38:51 localhost podman[27328]: Feb 20 02:38:52 localhost podman[27328]: 2026-02-20 07:38:51.90323746 +0000 UTC m=+0.055926438 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:38:52 localhost systemd[1]: var-lib-containers-storage-overlay-625fe6f7ab104b417b2859b195c9cbaa95ba9966850788fccd54bf3f215481ed-merged.mount: Deactivated successfully. Feb 20 02:38:54 localhost sshd[27583]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:38:55 localhost podman[27328]: 2026-02-20 07:38:55.251660996 +0000 UTC m=+3.404349994 container create d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, ceph=True, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main) Feb 20 02:38:55 localhost systemd[1]: Started libpod-conmon-d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead.scope. 
Feb 20 02:38:55 localhost systemd[1]: Started libcrun container. Feb 20 02:38:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d86875042f7cf3862c6ab37bebedd3f25b67e045a4a5c79e56937af643f90b0/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 02:38:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d86875042f7cf3862c6ab37bebedd3f25b67e045a4a5c79e56937af643f90b0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:38:55 localhost podman[27328]: 2026-02-20 07:38:55.344562098 +0000 UTC m=+3.497251096 container init d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, description=Red Hat Ceph Storage 7) Feb 20 02:38:55 localhost podman[27328]: 2026-02-20 07:38:55.360114205 +0000 
UTC m=+3.512803203 container start d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1770267347, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 02:38:55 localhost podman[27328]: 2026-02-20 07:38:55.360428303 +0000 UTC m=+3.513117351 container attach d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 02:38:56 localhost gracious_ramanujan[27586]: [ Feb 20 02:38:56 localhost gracious_ramanujan[27586]: { Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "available": false, Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "ceph_device": false, Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "device_id": "QEMU_DVD-ROM_QM00001", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "lsm_data": {}, Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "lvs": [], Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "path": "/dev/sr0", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "rejected_reasons": [ Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "Insufficient space (<5GB)", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "Has a FileSystem" Feb 20 02:38:56 localhost gracious_ramanujan[27586]: ], Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "sys_api": { Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "actuators": null, Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "device_nodes": "sr0", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "human_readable_size": "482.00 KB", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "id_bus": "ata", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "model": 
"QEMU DVD-ROM", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "nr_requests": "2", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "partitions": {}, Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "path": "/dev/sr0", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "removable": "1", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "rev": "2.5+", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "ro": "0", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "rotational": "1", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "sas_address": "", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "sas_device_handle": "", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "scheduler_mode": "mq-deadline", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "sectors": 0, Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "sectorsize": "2048", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "size": 493568.0, Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "support_discard": "0", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "type": "disk", Feb 20 02:38:56 localhost gracious_ramanujan[27586]: "vendor": "QEMU" Feb 20 02:38:56 localhost gracious_ramanujan[27586]: } Feb 20 02:38:56 localhost gracious_ramanujan[27586]: } Feb 20 02:38:56 localhost gracious_ramanujan[27586]: ] Feb 20 02:38:56 localhost systemd[1]: libpod-d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead.scope: Deactivated successfully. 
Feb 20 02:38:56 localhost podman[27328]: 2026-02-20 07:38:56.116309462 +0000 UTC m=+4.268998490 container died d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7) Feb 20 02:38:56 localhost systemd[1]: var-lib-containers-storage-overlay-4d86875042f7cf3862c6ab37bebedd3f25b67e045a4a5c79e56937af643f90b0-merged.mount: Deactivated successfully. 
Feb 20 02:38:56 localhost podman[28972]: 2026-02-20 07:38:56.178336449 +0000 UTC m=+0.056506784 container remove d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.expose-services=, name=rhceph, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 02:38:56 localhost systemd[1]: libpod-conmon-d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead.scope: Deactivated successfully. Feb 20 02:38:56 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:38:56 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:38:56 localhost systemd[1]: systemd-coredump.socket: Deactivated successfully. Feb 20 02:38:56 localhost systemd[1]: Closed Process Core Dump Socket. Feb 20 02:38:56 localhost systemd[1]: Stopping Process Core Dump Socket... 
Feb 20 02:38:56 localhost systemd[1]: Listening on Process Core Dump Socket. Feb 20 02:38:56 localhost systemd[1]: Reloading. Feb 20 02:38:56 localhost systemd-sysv-generator[29057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:38:56 localhost systemd-rc-local-generator[29052]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:38:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:38:57 localhost systemd[1]: Reloading. Feb 20 02:38:57 localhost systemd-rc-local-generator[29092]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:38:57 localhost systemd-sysv-generator[29098]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:38:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:01 localhost sshd[29104]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:39:06 localhost sshd[29106]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:39:13 localhost sshd[29108]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:39:19 localhost sshd[29110]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:39:25 localhost sshd[29155]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:39:25 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:39:25 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Feb 20 02:39:25 localhost podman[29185]: Feb 20 02:39:25 localhost podman[29185]: 2026-02-20 07:39:25.945478043 +0000 UTC m=+0.070302589 container create 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 02:39:25 localhost systemd[1]: Started libpod-conmon-787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b.scope. Feb 20 02:39:26 localhost systemd[1]: Started libcrun container. 
Feb 20 02:39:26 localhost podman[29185]: 2026-02-20 07:39:26.015694078 +0000 UTC m=+0.140518644 container init 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main) Feb 20 02:39:26 localhost podman[29185]: 2026-02-20 07:39:25.918476923 +0000 UTC m=+0.043301499 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:26 localhost podman[29185]: 2026-02-20 07:39:26.024995695 +0000 UTC m=+0.149820271 container start 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
version=7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.component=rhceph-container, release=1770267347, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, distribution-scope=public) Feb 20 02:39:26 localhost podman[29185]: 2026-02-20 07:39:26.025253523 +0000 UTC m=+0.150078089 container attach 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, release=1770267347, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, 
architecture=x86_64, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public) Feb 20 02:39:26 localhost eager_kilby[29200]: 167 167 Feb 20 02:39:26 localhost systemd[1]: libpod-787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b.scope: Deactivated successfully. Feb 20 02:39:26 localhost podman[29185]: 2026-02-20 07:39:26.029664649 +0000 UTC m=+0.154489265 container died 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, release=1770267347, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-02-09T10:25:24Z) Feb 20 02:39:26 localhost podman[29205]: 2026-02-20 07:39:26.115189388 +0000 
UTC m=+0.076313657 container remove 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347) Feb 20 02:39:26 localhost systemd[1]: libpod-conmon-787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b.scope: Deactivated successfully. Feb 20 02:39:26 localhost systemd[1]: Reloading. Feb 20 02:39:26 localhost systemd-rc-local-generator[29242]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:39:26 localhost systemd-sysv-generator[29247]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 02:39:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:26 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:39:26 localhost systemd[1]: Reloading. Feb 20 02:39:26 localhost systemd-rc-local-generator[29283]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:39:26 localhost systemd-sysv-generator[29287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:39:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:26 localhost systemd[1]: Reached target All Ceph clusters and services. Feb 20 02:39:26 localhost systemd[1]: Reloading. Feb 20 02:39:26 localhost systemd-sysv-generator[29326]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:39:26 localhost systemd-rc-local-generator[29322]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:39:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:26 localhost systemd[1]: Reached target Ceph cluster a8557ee9-b55d-5519-942c-cf8f6172f1d8. Feb 20 02:39:26 localhost systemd[1]: Reloading. Feb 20 02:39:26 localhost systemd-rc-local-generator[29360]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 02:39:26 localhost systemd-sysv-generator[29366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:39:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:27 localhost systemd[1]: Reloading. Feb 20 02:39:27 localhost systemd-rc-local-generator[29404]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:39:27 localhost systemd-sysv-generator[29408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:39:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:27 localhost systemd[1]: Created slice Slice /system/ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8. Feb 20 02:39:27 localhost systemd[1]: Reached target System Time Set. Feb 20 02:39:27 localhost systemd[1]: Reached target System Time Synchronized. Feb 20 02:39:27 localhost systemd[1]: Starting Ceph crash.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8... Feb 20 02:39:27 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 20 02:39:27 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Feb 20 02:39:27 localhost podman[29464]: Feb 20 02:39:27 localhost podman[29464]: 2026-02-20 07:39:27.726997503 +0000 UTC m=+0.076411991 container create 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Guillaume Abrioux , RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Feb 20 02:39:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19ace6a7f463760f85db583445ebc2a43a1a1df86c33f2f74f5f1cf9aa4feaca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19ace6a7f463760f85db583445ebc2a43a1a1df86c33f2f74f5f1cf9aa4feaca/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:27 localhost podman[29464]: 2026-02-20 07:39:27.695733542 +0000 UTC m=+0.045148030 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 02:39:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19ace6a7f463760f85db583445ebc2a43a1a1df86c33f2f74f5f1cf9aa4feaca/merged/etc/ceph/ceph.client.crash.np0005625204.keyring supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:27 localhost podman[29464]: 2026-02-20 07:39:27.815736358 +0000 UTC m=+0.165150856 container init 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.42.2, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 02:39:27 localhost podman[29464]: 2026-02-20 07:39:27.82759134 +0000 UTC m=+0.177005838 container start 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_BRANCH=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, build-date=2026-02-09T10:25:24Z, version=7, vcs-type=git, io.buildah.version=1.42.2, name=rhceph, description=Red Hat Ceph Storage 7, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 02:39:27 localhost bash[29464]: 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92
Feb 20 02:39:27 localhost systemd[1]: Started Ceph crash.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 02:39:27 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: INFO:ceph-crash:pinging cluster to exercise our key, trying key client.crash.np0005625204.
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: cluster:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: id: a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: health: HEALTH_WARN
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: OSD count 0 < osd_pool_default_size 3
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: services:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: mon: 3 daemons, quorum np0005625199,np0005625201,np0005625200 (age 13s)
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: mgr: np0005625199.ileebh(active, since 2m), standbys: np0005625201.mtnyvu, np0005625200.ypbkax
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: osd: 0 osds: 0 up, 0 in
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: data:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: pools: 0 pools, 0 pgs
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: objects: 0 objects, 0 B
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: usage: 0 B used, 0 B / 0 B avail
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: pgs:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: progress:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: Updating crash deployment (+4 -> 6) (8s)
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: [==============..............] (remaining: 8s)
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:
Feb 20 02:39:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Feb 20 02:39:30 localhost sshd[29506]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:39:31 localhost podman[29576]:
Feb 20 02:39:31 localhost podman[29576]: 2026-02-20 07:39:31.482410246 +0000 UTC m=+0.057528078 container create 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, ceph=True, build-date=2026-02-09T10:25:24Z, distribution-scope=public, release=1770267347, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 02:39:31 localhost systemd[1]: Started libpod-conmon-57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64.scope.
Feb 20 02:39:31 localhost systemd[1]: tmp-crun.60qbEo.mount: Deactivated successfully.
Feb 20 02:39:31 localhost systemd[1]: Started libcrun container.
Feb 20 02:39:31 localhost podman[29576]: 2026-02-20 07:39:31.454919499 +0000 UTC m=+0.030037361 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 02:39:31 localhost podman[29576]: 2026-02-20 07:39:31.559757426 +0000 UTC m=+0.134875258 container init 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, io.buildah.version=1.42.2, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_BRANCH=main, architecture=x86_64, build-date=2026-02-09T10:25:24Z, version=7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 02:39:31 localhost podman[29576]: 2026-02-20 07:39:31.567278085 +0000 UTC m=+0.142395917 container start 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, release=1770267347, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 02:39:31 localhost podman[29576]: 2026-02-20 07:39:31.567561114 +0000 UTC m=+0.142678986 container attach 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, GIT_CLEAN=True, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=)
Feb 20 02:39:31 localhost xenodochial_khorana[29591]: 167 167
Feb 20 02:39:31 localhost systemd[1]: libpod-57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64.scope: Deactivated successfully.
Feb 20 02:39:31 localhost podman[29576]: 2026-02-20 07:39:31.571194663 +0000 UTC m=+0.146312525 container died 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1770267347, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph)
Feb 20 02:39:31 localhost podman[29596]: 2026-02-20 07:39:31.665736071 +0000 UTC m=+0.080158604 container remove 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main)
Feb 20 02:39:31 localhost systemd[1]: libpod-conmon-57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64.scope: Deactivated successfully.
Feb 20 02:39:31 localhost podman[29617]:
Feb 20 02:39:31 localhost podman[29617]: 2026-02-20 07:39:31.878905839 +0000 UTC m=+0.079086328 container create f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, release=1770267347, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main)
Feb 20 02:39:31 localhost systemd[1]: Started libpod-conmon-f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f.scope.
Feb 20 02:39:31 localhost systemd[1]: Started libcrun container.
Feb 20 02:39:31 localhost podman[29617]: 2026-02-20 07:39:31.842185949 +0000 UTC m=+0.042366448 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 02:39:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:31 localhost podman[29617]: 2026-02-20 07:39:31.997045875 +0000 UTC m=+0.197226344 container init f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container)
Feb 20 02:39:32 localhost podman[29617]: 2026-02-20 07:39:32.007335804 +0000 UTC m=+0.207516293 container start f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, RELEASE=main, version=7, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public)
Feb 20 02:39:32 localhost podman[29617]: 2026-02-20 07:39:32.007593703 +0000 UTC m=+0.207774192 container attach f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, version=7, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public)
Feb 20 02:39:32 localhost hopeful_chandrasekhar[29632]: --> passed data devices: 0 physical, 2 LVM
Feb 20 02:39:32 localhost hopeful_chandrasekhar[29632]: --> relative data size: 1.0
Feb 20 02:39:32 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 20 02:39:32 localhost systemd[1]: tmp-crun.ZYF5rj.mount: Deactivated successfully.
Feb 20 02:39:32 localhost systemd[1]: var-lib-containers-storage-overlay-12aef165a6b94b5a43546700d80dfe1cf5541d2ba7c6202746b560edbf0420bc-merged.mount: Deactivated successfully.
Feb 20 02:39:32 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 246e60bc-5fa8-45c8-b746-372a7c540a58
Feb 20 02:39:33 localhost lvm[29688]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 20 02:39:33 localhost lvm[29688]: VG ceph_vg0 finished
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: stderr: got monmap epoch 3
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: --> Creating keyring file for osd.0
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Feb 20 02:39:33 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 246e60bc-5fa8-45c8-b746-372a7c540a58 --setuser ceph --setgroup ceph
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: stderr: 2026-02-20T07:39:33.652+0000 7f0743d42a80 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: stderr: 2026-02-20T07:39:33.653+0000 7f0743d42a80 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: --> ceph-volume lvm activate successful for osd ID: 0
Feb 20 02:39:35 localhost hopeful_chandrasekhar[29632]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Feb 20 02:39:36 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 20 02:39:36 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 1635aa65-16b7-4b42-b3ab-efa9a5fbb750
Feb 20 02:39:36 localhost lvm[30637]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 20 02:39:36 localhost lvm[30637]: VG ceph_vg1 finished
Feb 20 02:39:36 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 20 02:39:36 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-3
Feb 20 02:39:36 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Feb 20 02:39:36 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 20 02:39:36 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:36 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-3/activate.monmap
Feb 20 02:39:37 localhost hopeful_chandrasekhar[29632]: stderr: got monmap epoch 3
Feb 20 02:39:37 localhost hopeful_chandrasekhar[29632]: --> Creating keyring file for osd.3
Feb 20 02:39:37 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/keyring
Feb 20 02:39:37 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/
Feb 20 02:39:37 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 3 --monmap /var/lib/ceph/osd/ceph-3/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-3/ --osd-uuid 1635aa65-16b7-4b42-b3ab-efa9a5fbb750 --setuser ceph --setgroup ceph
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: stderr: 2026-02-20T07:39:37.240+0000 7f5b6ae59a80 -1 bluestore(/var/lib/ceph/osd/ceph-3//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: stderr: 2026-02-20T07:39:37.240+0000 7f5b6ae59a80 -1 bluestore(/var/lib/ceph/osd/ceph-3/) _read_fsid unparsable uuid
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: --> ceph-volume lvm activate successful for osd ID: 3
Feb 20 02:39:39 localhost hopeful_chandrasekhar[29632]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Feb 20 02:39:39 localhost systemd[1]: libpod-f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f.scope: Deactivated successfully.
Feb 20 02:39:39 localhost systemd[1]: libpod-f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f.scope: Consumed 3.675s CPU time.
Feb 20 02:39:39 localhost podman[31555]: 2026-02-20 07:39:39.635213312 +0000 UTC m=+0.038139349 container died f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 02:39:39 localhost systemd[1]: var-lib-containers-storage-overlay-17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541-merged.mount: Deactivated successfully.
Feb 20 02:39:39 localhost podman[31555]: 2026-02-20 07:39:39.669824393 +0000 UTC m=+0.072750390 container remove f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, ceph=True, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 02:39:39 localhost systemd[1]: libpod-conmon-f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f.scope: Deactivated successfully.
Feb 20 02:39:39 localhost sshd[31601]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:39:40 localhost podman[31638]:
Feb 20 02:39:40 localhost podman[31638]: 2026-02-20 07:39:40.337371743 +0000 UTC m=+0.107400792 container create 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, release=1770267347, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, RELEASE=main, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Feb 20 02:39:40 localhost podman[31638]: 2026-02-20 07:39:40.256588489 +0000 UTC m=+0.026617578 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 02:39:40 localhost systemd[1]: Started libpod-conmon-89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289.scope.
Feb 20 02:39:40 localhost systemd[1]: Started libcrun container.
Feb 20 02:39:40 localhost podman[31638]: 2026-02-20 07:39:40.402057145 +0000 UTC m=+0.172086204 container init 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 02:39:40 localhost podman[31638]: 2026-02-20 07:39:40.412044965 +0000 UTC m=+0.182074024 container start 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, release=1770267347, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.expose-services=)
Feb 20 02:39:40 localhost podman[31638]: 2026-02-20 07:39:40.412337635 +0000 UTC m=+0.182366684 container attach 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux , release=1770267347, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0,
io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 02:39:40 localhost determined_franklin[31654]: 167 167 Feb 20 02:39:40 localhost systemd[1]: libpod-89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289.scope: Deactivated successfully. Feb 20 02:39:40 localhost podman[31638]: 2026-02-20 07:39:40.415353394 +0000 UTC m=+0.185382463 container died 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, GIT_CLEAN=True, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=) Feb 20 02:39:40 localhost podman[31659]: 2026-02-20 07:39:40.500988248 +0000 UTC m=+0.076916838 container remove 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z) Feb 20 02:39:40 localhost systemd[1]: libpod-conmon-89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289.scope: Deactivated successfully. Feb 20 02:39:40 localhost systemd[1]: var-lib-containers-storage-overlay-88fd5bb677778f9d42b1999f16126bd7fc0445005e779092c3691ea9909bd16f-merged.mount: Deactivated successfully. 
Feb 20 02:39:40 localhost podman[31679]: Feb 20 02:39:40 localhost podman[31679]: 2026-02-20 07:39:40.674851561 +0000 UTC m=+0.038811481 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:40 localhost podman[31679]: 2026-02-20 07:39:40.989229286 +0000 UTC m=+0.353189166 container create cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2) Feb 20 02:39:41 localhost systemd[1]: Started libpod-conmon-cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308.scope. Feb 20 02:39:41 localhost systemd[1]: Started libcrun container. 
Feb 20 02:39:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4de34948a604837cd4aadbee0dab07955e3e1a07e0ead3294f35725c305c7e1/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4de34948a604837cd4aadbee0dab07955e3e1a07e0ead3294f35725c305c7e1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4de34948a604837cd4aadbee0dab07955e3e1a07e0ead3294f35725c305c7e1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:41 localhost podman[31679]: 2026-02-20 07:39:41.059238894 +0000 UTC m=+0.423198774 container init cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, RELEASE=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, 
io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64) Feb 20 02:39:41 localhost systemd[1]: tmp-crun.oapkBT.mount: Deactivated successfully. Feb 20 02:39:41 localhost podman[31679]: 2026-02-20 07:39:41.07336577 +0000 UTC m=+0.437325660 container start cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main) Feb 20 02:39:41 localhost podman[31679]: 2026-02-20 07:39:41.073636549 +0000 UTC m=+0.437596439 container attach cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., 
ceph=True, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux ) Feb 20 02:39:41 localhost laughing_hamilton[31694]: { Feb 20 02:39:41 localhost laughing_hamilton[31694]: "0": [ Feb 20 02:39:41 localhost laughing_hamilton[31694]: { Feb 20 02:39:41 localhost laughing_hamilton[31694]: "devices": [ Feb 20 02:39:41 localhost laughing_hamilton[31694]: "/dev/loop3" Feb 20 02:39:41 localhost laughing_hamilton[31694]: ], Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_name": "ceph_lv0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_path": "/dev/ceph_vg0/ceph_lv0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_size": "7511998464", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_tags": 
"ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=z1sb4p-FPG9-bE4e-guP3-EYfP-SVIc-HQbUof,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=a8557ee9-b55d-5519-942c-cf8f6172f1d8,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=246e60bc-5fa8-45c8-b746-372a7c540a58,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_uuid": "z1sb4p-FPG9-bE4e-guP3-EYfP-SVIc-HQbUof", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "name": "ceph_lv0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "path": "/dev/ceph_vg0/ceph_lv0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "tags": { Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.block_device": "/dev/ceph_vg0/ceph_lv0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.block_uuid": "z1sb4p-FPG9-bE4e-guP3-EYfP-SVIc-HQbUof", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.cephx_lockbox_secret": "", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.cluster_fsid": "a8557ee9-b55d-5519-942c-cf8f6172f1d8", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.cluster_name": "ceph", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.crush_device_class": "", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.encrypted": "0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.osd_fsid": "246e60bc-5fa8-45c8-b746-372a7c540a58", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.osd_id": "0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.osdspec_affinity": "default_drive_group", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.type": "block", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.vdo": "0" Feb 20 02:39:41 localhost laughing_hamilton[31694]: }, Feb 20 02:39:41 localhost laughing_hamilton[31694]: "type": "block", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "vg_name": "ceph_vg0" Feb 20 02:39:41 
localhost laughing_hamilton[31694]: } Feb 20 02:39:41 localhost laughing_hamilton[31694]: ], Feb 20 02:39:41 localhost laughing_hamilton[31694]: "3": [ Feb 20 02:39:41 localhost laughing_hamilton[31694]: { Feb 20 02:39:41 localhost laughing_hamilton[31694]: "devices": [ Feb 20 02:39:41 localhost laughing_hamilton[31694]: "/dev/loop4" Feb 20 02:39:41 localhost laughing_hamilton[31694]: ], Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_name": "ceph_lv1", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_path": "/dev/ceph_vg1/ceph_lv1", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_size": "7511998464", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=ci6Jyp-kDDl-Vyqq-NknI-f3us-8bH1-9NWith,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=a8557ee9-b55d-5519-942c-cf8f6172f1d8,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=1635aa65-16b7-4b42-b3ab-efa9a5fbb750,ceph.osd_id=3,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "lv_uuid": "ci6Jyp-kDDl-Vyqq-NknI-f3us-8bH1-9NWith", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "name": "ceph_lv1", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "path": "/dev/ceph_vg1/ceph_lv1", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "tags": { Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.block_device": "/dev/ceph_vg1/ceph_lv1", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.block_uuid": "ci6Jyp-kDDl-Vyqq-NknI-f3us-8bH1-9NWith", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.cephx_lockbox_secret": "", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.cluster_fsid": "a8557ee9-b55d-5519-942c-cf8f6172f1d8", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.cluster_name": "ceph", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.crush_device_class": "", Feb 20 02:39:41 localhost 
laughing_hamilton[31694]: "ceph.encrypted": "0", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.osd_fsid": "1635aa65-16b7-4b42-b3ab-efa9a5fbb750", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.osd_id": "3", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.osdspec_affinity": "default_drive_group", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.type": "block", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "ceph.vdo": "0" Feb 20 02:39:41 localhost laughing_hamilton[31694]: }, Feb 20 02:39:41 localhost laughing_hamilton[31694]: "type": "block", Feb 20 02:39:41 localhost laughing_hamilton[31694]: "vg_name": "ceph_vg1" Feb 20 02:39:41 localhost laughing_hamilton[31694]: } Feb 20 02:39:41 localhost laughing_hamilton[31694]: ] Feb 20 02:39:41 localhost laughing_hamilton[31694]: } Feb 20 02:39:41 localhost systemd[1]: libpod-cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308.scope: Deactivated successfully. Feb 20 02:39:41 localhost podman[31679]: 2026-02-20 07:39:41.40821469 +0000 UTC m=+0.772174630 container died cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, vendor=Red Hat, 
Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, build-date=2026-02-09T10:25:24Z, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 02:39:41 localhost podman[31704]: 2026-02-20 07:39:41.480814674 +0000 UTC m=+0.065773259 container remove cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, version=7, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, ceph=True, architecture=x86_64, io.buildah.version=1.42.2) Feb 20 02:39:41 localhost systemd[1]: libpod-conmon-cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308.scope: Deactivated successfully. 
Feb 20 02:39:41 localhost systemd[1]: var-lib-containers-storage-overlay-b4de34948a604837cd4aadbee0dab07955e3e1a07e0ead3294f35725c305c7e1-merged.mount: Deactivated successfully. Feb 20 02:39:42 localhost podman[31791]: Feb 20 02:39:42 localhost podman[31791]: 2026-02-20 07:39:42.175660825 +0000 UTC m=+0.056614018 container create 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 02:39:42 localhost systemd[1]: Started libpod-conmon-7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049.scope. Feb 20 02:39:42 localhost systemd[1]: Started libcrun container. 
Feb 20 02:39:42 localhost podman[31791]: 2026-02-20 07:39:42.234379811 +0000 UTC m=+0.115333014 container init 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 02:39:42 localhost podman[31791]: 2026-02-20 07:39:42.243306756 +0000 UTC m=+0.124259979 container start 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a 
fully featured and supported base image., RELEASE=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, release=1770267347, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 02:39:42 localhost podman[31791]: 2026-02-20 07:39:42.243679698 +0000 UTC m=+0.124632891 container attach 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, ceph=True, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7) Feb 20 02:39:42 localhost youthful_visvesvaraya[31806]: 167 167 Feb 20 02:39:42 localhost systemd[1]: libpod-7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049.scope: Deactivated successfully. Feb 20 02:39:42 localhost podman[31791]: 2026-02-20 07:39:42.246983357 +0000 UTC m=+0.127936600 container died 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, ceph=True, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, version=7, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 02:39:42 localhost podman[31791]: 2026-02-20 07:39:42.148961475 +0000 UTC m=+0.029914708 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:42 localhost 
podman[31811]: 2026-02-20 07:39:42.335120032 +0000 UTC m=+0.074676582 container remove 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, release=1770267347, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, description=Red Hat Ceph Storage 7) Feb 20 02:39:42 localhost systemd[1]: libpod-conmon-7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049.scope: Deactivated successfully. Feb 20 02:39:42 localhost systemd[1]: tmp-crun.hw6ubF.mount: Deactivated successfully. Feb 20 02:39:42 localhost systemd[1]: var-lib-containers-storage-overlay-9b0dadab9abd311e939ddbceacdf99b9c066f88d556eb5a95407461778d365e2-merged.mount: Deactivated successfully. 
Feb 20 02:39:42 localhost podman[31838]: Feb 20 02:39:42 localhost podman[31838]: 2026-02-20 07:39:42.655764585 +0000 UTC m=+0.071581861 container create 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, RELEASE=main, architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., ceph=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True) Feb 20 02:39:42 localhost systemd[1]: Started libpod-conmon-21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9.scope. Feb 20 02:39:42 localhost systemd[1]: Started libcrun container. 
Feb 20 02:39:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:42 localhost podman[31838]: 2026-02-20 07:39:42.627587556 +0000 UTC m=+0.043404842 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:42 localhost podman[31838]: 2026-02-20 07:39:42.775977839 +0000 UTC m=+0.191795115 container init 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.42.2, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=) Feb 20 02:39:42 localhost podman[31838]: 2026-02-20 07:39:42.791518911 +0000 UTC m=+0.207336187 container start 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, 
description=Red Hat Ceph Storage 7, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git) Feb 20 02:39:42 localhost podman[31838]: 2026-02-20 07:39:42.791838391 +0000 UTC m=+0.207655677 container attach 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, maintainer=Guillaume Abrioux , ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Feb 20 02:39:42 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test[31853]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Feb 20 02:39:42 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test[31853]: [--no-systemd] [--no-tmpfs] Feb 20 02:39:42 localhost 
ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test[31853]: ceph-volume activate: error: unrecognized arguments: --bad-option Feb 20 02:39:43 localhost systemd[1]: libpod-21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9.scope: Deactivated successfully. Feb 20 02:39:43 localhost podman[31838]: 2026-02-20 07:39:43.007244684 +0000 UTC m=+0.423061970 container died 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, version=7, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 02:39:43 localhost podman[31858]: 2026-02-20 07:39:43.097400146 +0000 UTC m=+0.077393502 container remove 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2) Feb 20 02:39:43 localhost systemd-journald[618]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Feb 20 02:39:43 localhost systemd-journald[618]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 20 02:39:43 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 02:39:43 localhost systemd[1]: libpod-conmon-21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9.scope: Deactivated successfully. Feb 20 02:39:43 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 02:39:43 localhost systemd[1]: Reloading. Feb 20 02:39:43 localhost systemd-rc-local-generator[31913]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:39:43 localhost systemd-sysv-generator[31917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:39:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:43 localhost systemd[1]: var-lib-containers-storage-overlay-a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950-merged.mount: Deactivated successfully. Feb 20 02:39:43 localhost systemd[1]: tmp-crun.KO4bMA.mount: Deactivated successfully. Feb 20 02:39:43 localhost systemd[1]: Reloading. Feb 20 02:39:43 localhost systemd-rc-local-generator[31954]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:39:43 localhost systemd-sysv-generator[31960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:39:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:43 localhost systemd[1]: Starting Ceph osd.0 for a8557ee9-b55d-5519-942c-cf8f6172f1d8... 
Feb 20 02:39:44 localhost podman[32018]: Feb 20 02:39:44 localhost podman[32018]: 2026-02-20 07:39:44.201573114 +0000 UTC m=+0.068910994 container create b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, ceph=True, distribution-scope=public, release=1770267347, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git) Feb 20 02:39:44 localhost systemd[1]: Started libcrun container. 
Feb 20 02:39:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:44 localhost podman[32018]: 2026-02-20 07:39:44.173498898 +0000 UTC m=+0.040836778 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:44 localhost podman[32018]: 2026-02-20 07:39:44.316216904 +0000 UTC m=+0.183554784 container init b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a 
fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.openshift.expose-services=, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Feb 20 02:39:44 localhost podman[32018]: 2026-02-20 07:39:44.322428889 +0000 UTC m=+0.189766769 container start b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume 
Abrioux , CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z) Feb 20 02:39:44 localhost podman[32018]: 2026-02-20 07:39:44.322638936 +0000 UTC m=+0.189976816 container attach b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, distribution-scope=public, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, version=7) Feb 20 02:39:44 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Feb 20 02:39:44 localhost bash[32018]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Feb 20 02:39:44 localhost 
ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0 Feb 20 02:39:44 localhost bash[32018]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0 Feb 20 02:39:44 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0 Feb 20 02:39:44 localhost bash[32018]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0 Feb 20 02:39:44 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Feb 20 02:39:44 localhost bash[32018]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Feb 20 02:39:44 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:44 localhost bash[32018]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:44 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Feb 20 02:39:44 localhost bash[32018]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Feb 20 02:39:45 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: --> ceph-volume raw activate successful for osd ID: 0 Feb 20 02:39:45 localhost bash[32018]: --> ceph-volume raw activate successful for osd ID: 0 Feb 20 02:39:45 localhost systemd[1]: libpod-b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0.scope: Deactivated successfully. 
Feb 20 02:39:45 localhost podman[32146]: 2026-02-20 07:39:45.085451467 +0000 UTC m=+0.038125278 container died b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.) Feb 20 02:39:45 localhost systemd[1]: tmp-crun.N3mrgR.mount: Deactivated successfully. Feb 20 02:39:45 localhost systemd[1]: var-lib-containers-storage-overlay-46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c-merged.mount: Deactivated successfully. 
Feb 20 02:39:45 localhost podman[32146]: 2026-02-20 07:39:45.136643954 +0000 UTC m=+0.089317685 container remove b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, GIT_BRANCH=main, release=1770267347, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph) Feb 20 02:39:45 localhost podman[32207]: Feb 20 02:39:45 localhost podman[32207]: 2026-02-20 07:39:45.433240594 +0000 UTC m=+0.069225533 container create ced4780c50d845341e762bdcc6bd66af77b06dafe5ec206731ae918e72f08b86 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, 
vcs-type=git, release=1770267347, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.openshift.expose-services=) Feb 20 02:39:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:45 localhost podman[32207]: 2026-02-20 07:39:45.406152381 +0000 UTC m=+0.042137340 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/var/lib/ceph/crash supports timestamps until 2038 
(0x7fffffff) Feb 20 02:39:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:45 localhost podman[32207]: 2026-02-20 07:39:45.550885403 +0000 UTC m=+0.186870342 container init ced4780c50d845341e762bdcc6bd66af77b06dafe5ec206731ae918e72f08b86 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, name=rhceph, io.buildah.version=1.42.2) Feb 20 02:39:45 localhost podman[32207]: 2026-02-20 07:39:45.558825195 +0000 UTC m=+0.194810134 container start ced4780c50d845341e762bdcc6bd66af77b06dafe5ec206731ae918e72f08b86 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0, RELEASE=main, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) Feb 20 02:39:45 localhost bash[32207]: ced4780c50d845341e762bdcc6bd66af77b06dafe5ec206731ae918e72f08b86 Feb 20 02:39:45 localhost systemd[1]: Started Ceph osd.0 for a8557ee9-b55d-5519-942c-cf8f6172f1d8. 
Feb 20 02:39:45 localhost ceph-osd[32226]: set uid:gid to 167:167 (ceph:ceph) Feb 20 02:39:45 localhost ceph-osd[32226]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-osd, pid 2 Feb 20 02:39:45 localhost ceph-osd[32226]: pidfile_write: ignore empty --pid-file Feb 20 02:39:45 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:45 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Feb 20 02:39:45 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 20 02:39:45 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 20 02:39:45 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:45 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Feb 20 02:39:45 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 20 02:39:45 localhost ceph-osd[32226]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Feb 20 02:39:45 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) close Feb 20 02:39:45 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) close Feb 20 02:39:46 localhost ceph-osd[32226]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal Feb 20 02:39:46 localhost ceph-osd[32226]: 
load: jerasure load: lrc Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) close Feb 20 02:39:46 localhost podman[32319]: Feb 20 02:39:46 localhost podman[32319]: 2026-02-20 07:39:46.34601191 +0000 UTC m=+0.068117157 container create 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, com.redhat.component=rhceph-container, 
name=rhceph, io.buildah.version=1.42.2, distribution-scope=public, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) Feb 20 02:39:46 localhost systemd[1]: Started libpod-conmon-7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306.scope. Feb 20 02:39:46 localhost systemd[1]: Started libcrun container. Feb 20 02:39:46 localhost podman[32319]: 2026-02-20 07:39:46.412049838 +0000 UTC m=+0.134155075 container init 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, release=1770267347, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, distribution-scope=public, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main) Feb 20 02:39:46 localhost podman[32319]: 2026-02-20 07:39:46.318813343 +0000 UTC m=+0.040918610 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:46 localhost 
podman[32319]: 2026-02-20 07:39:46.42276455 +0000 UTC m=+0.144869787 container start 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, io.buildah.version=1.42.2, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, vendor=Red Hat, Inc., GIT_CLEAN=True) Feb 20 02:39:46 localhost podman[32319]: 2026-02-20 07:39:46.423738983 +0000 UTC m=+0.145844240 container attach 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux 
, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, build-date=2026-02-09T10:25:24Z, architecture=x86_64, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Feb 20 02:39:46 localhost funny_carver[32334]: 167 167 Feb 20 02:39:46 localhost systemd[1]: libpod-7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306.scope: Deactivated successfully. Feb 20 02:39:46 localhost podman[32319]: 2026-02-20 07:39:46.426928858 +0000 UTC m=+0.149034125 container died 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, release=1770267347, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., io.openshift.tags=rhceph ceph, version=7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, GIT_CLEAN=True) Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) close Feb 20 02:39:46 localhost podman[32341]: 2026-02-20 07:39:46.510968499 +0000 UTC m=+0.072302395 container remove 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, vcs-type=git, RELEASE=main, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 
on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.42.2, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 02:39:46 localhost systemd[1]: libpod-conmon-7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306.scope: Deactivated successfully. Feb 20 02:39:46 localhost ceph-osd[32226]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Feb 20 02:39:46 localhost ceph-osd[32226]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Feb 20 02:39:46 localhost ceph-osd[32226]: 
bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs mount Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs mount shared_bdev_used = 0 Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: RocksDB version: 7.9.2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Git sha 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Compile date 2026-02-06 00:00:00 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: DB SUMMARY Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: DB Session ID: B1RZZQUR7VWFY9T1SVAY Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: CURRENT file: CURRENT Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: IDENTITY file: IDENTITY Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.error_if_exists: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.create_if_missing: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_checks: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.track_and_verify_wals_in_manifest: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.env: 0x55bf8e014c40 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.fs: LegacyFileSystem Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.info_log: 0x55bf8ed1c7c0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_file_opening_threads: 16 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.statistics: (nil) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.use_fsync: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_log_file_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.log_file_time_to_roll: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.keep_log_file_num: 1000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.recycle_log_file_num: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_fallocate: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_mmap_reads: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_mmap_writes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.use_direct_reads: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.create_missing_column_families: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.db_log_dir: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_dir: db.wal Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_cache_numshardbits: 6 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 20 02:39:46 localhost 
ceph-osd[32226]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.advise_random_on_open: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.db_write_buffer_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_manager: 0x55bf8dd6a140 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.use_adaptive_mutex: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.rate_limiter: (nil) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_recovery_mode: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_thread_tracking: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_pipelined_write: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.unordered_write: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.row_cache: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_filter: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.allow_ingest_behind: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.two_write_queues: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.manual_wal_flush: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_compression: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.atomic_flush: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.persist_stats_to_disk: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.log_readahead_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.best_efforts_recovery: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_data_in_errors: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.db_host_id: __hostname__ Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enforce_single_del_contracts: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_background_jobs: 4 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_background_compactions: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_subcompactions: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.writable_file_max_buffer_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.delayed_write_rate : 16777216 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_total_wal_size: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: 
rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.stats_dump_period_sec: 600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.stats_persist_period_sec: 600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_open_files: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bytes_per_sync: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_readahead_size: 2097152 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_background_flushes: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Compression algorithms supported: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kZSTD supported: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kXpressCompression supported: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kBZip2Compression supported: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kLZ4Compression supported: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kZlibCompression supported: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kSnappyCompression supported: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: DMutex implementation: pthread_mutex_t Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5527] 
Recovering from manifest file: db/MANIFEST-000032 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd58850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 
verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors:
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd58850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd58850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd58850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd58850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost
ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:46 localhost ceph-osd[32226]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.bloom_locality: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [p-1]: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd58850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.write_buffer_size: 16777216 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 
32767 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:46 localhost 
ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 
02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd58850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:46 localhost 
ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:46 localhost 
ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.periodic_compaction_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:46 
localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1cba0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:46 localhost 
ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1cba0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1cba0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 58a199a0-9c9f-484d-9a27-dda744c2ce19
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573186732885, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573186733097, "job": 1, "event": "recovery_finished"}
Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Feb 20 02:39:46 localhost ceph-osd[32226]: freelist init
Feb 20 02:39:46 localhost ceph-osd[32226]: freelist _read_cfg
Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs umount
Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) close
Feb 20 02:39:46 localhost podman[32568]:
Feb 20 02:39:46 localhost podman[32568]: 2026-02-20 07:39:46.832343046 +0000 UTC m=+0.073770344 container create f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 20 02:39:46 localhost systemd[1]: Started libpod-conmon-f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477.scope.
Feb 20 02:39:46 localhost systemd[1]: Started libcrun container.
Feb 20 02:39:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:46 localhost podman[32568]: 2026-02-20 07:39:46.808642394 +0000 UTC m=+0.050069752 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:46 localhost podman[32568]: 2026-02-20 07:39:46.960555702 +0000 UTC m=+0.201983030 container init f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, release=1770267347, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., RELEASE=main, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=) Feb 20 02:39:46 localhost podman[32568]: 2026-02-20 07:39:46.971499113 +0000 UTC m=+0.212926431 container start f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, io.openshift.tags=rhceph ceph, release=1770267347, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.buildah.version=1.42.2, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, 
url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 02:39:46 localhost podman[32568]: 2026-02-20 07:39:46.971860675 +0000 UTC m=+0.213288133 container attach f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-type=git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, distribution-scope=public, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid 
argument Feb 20 02:39:46 localhost ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs mount Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Feb 20 02:39:46 localhost ceph-osd[32226]: bluefs mount shared_bdev_used = 4718592 Feb 20 02:39:46 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: RocksDB version: 7.9.2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Git sha 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Compile date 2026-02-06 00:00:00 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: DB SUMMARY Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: DB Session ID: B1RZZQUR7VWFY9T1SVAZ Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: CURRENT file: CURRENT Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: IDENTITY file: IDENTITY Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.error_if_exists: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.create_if_missing: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.paranoid_checks: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.flush_verify_memtable_count: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.env: 0x55bf8ddc4690 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.fs: LegacyFileSystem Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.info_log: 0x55bf8ed883c0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_file_opening_threads: 16 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.statistics: (nil) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.use_fsync: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_log_file_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.log_file_time_to_roll: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.keep_log_file_num: 1000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.recycle_log_file_num: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_fallocate: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_mmap_reads: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_mmap_writes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.use_direct_reads: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.create_missing_column_families: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.db_log_dir: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_dir: db.wal Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_cache_numshardbits: 6 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 20 02:39:46 
localhost ceph-osd[32226]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.advise_random_on_open: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.db_write_buffer_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_manager: 0x55bf8dd6b5e0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.use_adaptive_mutex: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.rate_limiter: (nil) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_recovery_mode: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_thread_tracking: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_pipelined_write: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.unordered_write: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.row_cache: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_filter: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.avoid_flush_during_recovery: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_ingest_behind: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.two_write_queues: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.manual_wal_flush: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_compression: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.atomic_flush: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.persist_stats_to_disk: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.log_readahead_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.best_efforts_recovery: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.allow_data_in_errors: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.db_host_id: __hostname__ Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.enforce_single_del_contracts: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_background_jobs: 4 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_background_compactions: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_subcompactions: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.writable_file_max_buffer_size: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.delayed_write_rate : 16777216 Feb 20 02:39:46 localhost ceph-osd[32226]: 
rocksdb: Options.max_total_wal_size: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.stats_dump_period_sec: 600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.stats_persist_period_sec: 600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_open_files: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bytes_per_sync: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_readahead_size: 2097152 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_background_flushes: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Compression algorithms supported: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kZSTD supported: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kXpressCompression supported: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kBZip2Compression supported: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kLZ4Compression supported: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kZlibCompression supported: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: #011kSnappyCompression supported: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: DMutex implementation: pthread_mutex_t Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5527] Recovering from manifest 
file: db/MANIFEST-000032 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:46 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 
02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 
pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: 
-14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.level0_stop_writes_trigger: 36 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:47 localhost 
ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None Feb 20 02:39:47 
localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:47 localhost 
ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost 
ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: 
rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:47 localhost ceph-osd[32226]: 
rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.blob_compaction_readahead_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 
1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost 
ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:47 localhost 
ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: 
flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost 
ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd582d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed89980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd59610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed89980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd59610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:47 localhost ceph-osd[32226]: 
rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.blob_compression_type: NoCompression Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.merge_operator: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed89980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55bf8dd59610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression: LZ4 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.num_levels: 7 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.target_file_size_multiplier: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
Options.report_bg_io_stats: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, 
next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 58a199a0-9c9f-484d-9a27-dda744c2ce19 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187009304, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 20 
02:39:47 localhost ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187015153, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573187, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "58a199a0-9c9f-484d-9a27-dda744c2ce19", "db_session_id": "B1RZZQUR7VWFY9T1SVAZ", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187019626, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, 
"raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573187, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "58a199a0-9c9f-484d-9a27-dda744c2ce19", "db_session_id": "B1RZZQUR7VWFY9T1SVAZ", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187023584, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", 
"property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573187, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "58a199a0-9c9f-484d-9a27-dda744c2ce19", "db_session_id": "B1RZZQUR7VWFY9T1SVAZ", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187028190, "job": 1, "event": "recovery_finished"} Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55bf8de1a700 Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: DB pointer 0x55bf8ec73a00 Feb 20 02:39:47 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 20 02:39:47 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4 Feb 20 02:39:47 localhost ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: 
[db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 02:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 0.1 total, 0.1 interval
Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usag
Feb 20 02:39:47 localhost ceph-osd[32226]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 20 02:39:47 localhost ceph-osd[32226]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 20 02:39:47 localhost ceph-osd[32226]: _get_class not permitted to load lua
Feb 20 02:39:47 localhost ceph-osd[32226]: _get_class not permitted to load sdk
Feb 20 02:39:47 localhost ceph-osd[32226]: _get_class not permitted to load test_remote_reads
Feb 20 02:39:47 localhost ceph-osd[32226]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 20
02:39:47 localhost ceph-osd[32226]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Feb 20 02:39:47 localhost ceph-osd[32226]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds Feb 20 02:39:47 localhost ceph-osd[32226]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Feb 20 02:39:47 localhost ceph-osd[32226]: osd.0 0 load_pgs Feb 20 02:39:47 localhost ceph-osd[32226]: osd.0 0 load_pgs opened 0 pgs Feb 20 02:39:47 localhost ceph-osd[32226]: osd.0 0 log_to_monitors true Feb 20 02:39:47 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0[32221]: 2026-02-20T07:39:47.067+0000 7fd036886a80 -1 osd.0 0 log_to_monitors true Feb 20 02:39:47 localhost systemd[1]: var-lib-containers-storage-overlay-88af1a01f39f8f316dcc9d82ed7d17047230f7f134add5e5b1e51207e3286387-merged.mount: Deactivated successfully. Feb 20 02:39:47 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test[32583]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Feb 20 02:39:47 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test[32583]: [--no-systemd] [--no-tmpfs] Feb 20 02:39:47 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test[32583]: ceph-volume activate: error: unrecognized arguments: --bad-option Feb 20 02:39:47 localhost systemd[1]: libpod-f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477.scope: Deactivated successfully. 
Feb 20 02:39:47 localhost podman[32568]: 2026-02-20 07:39:47.188809639 +0000 UTC m=+0.430236957 container died f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 02:39:47 localhost systemd[1]: var-lib-containers-storage-overlay-90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5-merged.mount: Deactivated successfully. 
Feb 20 02:39:47 localhost podman[32803]: 2026-02-20 07:39:47.262479178 +0000 UTC m=+0.067979602 container remove f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, ceph=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, version=7) Feb 20 02:39:47 localhost systemd[1]: libpod-conmon-f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477.scope: Deactivated successfully. Feb 20 02:39:47 localhost systemd[1]: Reloading. Feb 20 02:39:47 localhost systemd-sysv-generator[32857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 02:39:47 localhost systemd-rc-local-generator[32850]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:39:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:47 localhost systemd[1]: Reloading. Feb 20 02:39:47 localhost systemd-rc-local-generator[32900]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:39:47 localhost systemd-sysv-generator[32905]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:39:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:39:48 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Feb 20 02:39:48 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Feb 20 02:39:48 localhost systemd[1]: Starting Ceph osd.3 for a8557ee9-b55d-5519-942c-cf8f6172f1d8... 
Feb 20 02:39:48 localhost sshd[32965]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:39:48 localhost podman[32959]: Feb 20 02:39:48 localhost podman[32959]: 2026-02-20 07:39:48.461268954 +0000 UTC m=+0.072977997 container create 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64) Feb 20 02:39:48 localhost systemd[1]: tmp-crun.QEeBDS.mount: Deactivated successfully. Feb 20 02:39:48 localhost systemd[1]: Started libcrun container. 
Feb 20 02:39:48 localhost podman[32959]: 2026-02-20 07:39:48.432209607 +0000 UTC m=+0.043918650 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:48 localhost podman[32959]: 2026-02-20 07:39:48.594352083 +0000 UTC m=+0.206061136 container init 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, 
maintainer=Guillaume Abrioux , ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vendor=Red Hat, Inc.) Feb 20 02:39:48 localhost podman[32959]: 2026-02-20 07:39:48.603087891 +0000 UTC m=+0.214796944 container start 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=7, GIT_CLEAN=True, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph 
Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public) Feb 20 02:39:48 localhost podman[32959]: 2026-02-20 07:39:48.603291167 +0000 UTC m=+0.215000220 container attach 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 02:39:48 localhost ceph-osd[32226]: osd.0 0 done with init, starting boot process Feb 20 02:39:48 localhost ceph-osd[32226]: osd.0 0 start_boot Feb 20 02:39:48 localhost ceph-osd[32226]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1 Feb 20 02:39:48 localhost ceph-osd[32226]: osd.0 0 maybe_override_options_for_qos 
osd_recovery_max_active set to 0 Feb 20 02:39:48 localhost ceph-osd[32226]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Feb 20 02:39:48 localhost ceph-osd[32226]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Feb 20 02:39:48 localhost ceph-osd[32226]: osd.0 0 bench count 12288000 bsize 4 KiB Feb 20 02:39:49 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Feb 20 02:39:49 localhost bash[32959]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Feb 20 02:39:49 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Feb 20 02:39:49 localhost bash[32959]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Feb 20 02:39:49 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Feb 20 02:39:49 localhost bash[32959]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Feb 20 02:39:49 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Feb 20 02:39:49 localhost bash[32959]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Feb 20 02:39:49 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block Feb 20 02:39:49 localhost bash[32959]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block Feb 20 02:39:49 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: 
/usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Feb 20 02:39:49 localhost bash[32959]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Feb 20 02:39:49 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: --> ceph-volume raw activate successful for osd ID: 3 Feb 20 02:39:49 localhost bash[32959]: --> ceph-volume raw activate successful for osd ID: 3 Feb 20 02:39:49 localhost systemd[1]: libpod-734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9.scope: Deactivated successfully. Feb 20 02:39:49 localhost podman[32959]: 2026-02-20 07:39:49.306229035 +0000 UTC m=+0.917938078 container died 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, io.openshift.expose-services=, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, ceph=True, description=Red Hat Ceph Storage 7) Feb 20 02:39:49 
localhost podman[33095]: 2026-02-20 07:39:49.41073929 +0000 UTC m=+0.094895770 container remove 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, release=1770267347, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, name=rhceph) Feb 20 02:39:49 localhost systemd[1]: var-lib-containers-storage-overlay-73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925-merged.mount: Deactivated successfully. 
Feb 20 02:39:49 localhost podman[33159]: Feb 20 02:39:49 localhost podman[33159]: 2026-02-20 07:39:49.736729299 +0000 UTC m=+0.088112016 container create bdc3228407c0c36f5a92fd5f014341a45c6fa1b30fe114cac39db6f5a19a0d35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, release=1770267347, version=7, name=rhceph) Feb 20 02:39:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:49 localhost podman[33159]: 2026-02-20 07:39:49.705096916 
+0000 UTC m=+0.056479693 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 02:39:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:49 localhost podman[33159]: 2026-02-20 07:39:49.834082889 +0000 UTC m=+0.185465636 container init bdc3228407c0c36f5a92fd5f014341a45c6fa1b30fe114cac39db6f5a19a0d35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
version=7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 02:39:49 localhost podman[33159]: 2026-02-20 07:39:49.851194713 +0000 UTC m=+0.202577430 container start bdc3228407c0c36f5a92fd5f014341a45c6fa1b30fe114cac39db6f5a19a0d35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, build-date=2026-02-09T10:25:24Z, name=rhceph, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 02:39:49 localhost bash[33159]: bdc3228407c0c36f5a92fd5f014341a45c6fa1b30fe114cac39db6f5a19a0d35 Feb 20 02:39:49 localhost systemd[1]: Started Ceph osd.3 for a8557ee9-b55d-5519-942c-cf8f6172f1d8. 
Feb 20 02:39:49 localhost ceph-osd[33177]: set uid:gid to 167:167 (ceph:ceph)
Feb 20 02:39:49 localhost ceph-osd[33177]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-osd, pid 2
Feb 20 02:39:49 localhost ceph-osd[33177]: pidfile_write: ignore empty --pid-file
Feb 20 02:39:49 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:49 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 02:39:49 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 02:39:49 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 02:39:49 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:49 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 02:39:49 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 02:39:49 localhost ceph-osd[33177]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Feb 20 02:39:49 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 02:39:50 localhost ceph-osd[33177]: starting osd.3 osd_data /var/lib/ceph/osd/ceph-3 /var/lib/ceph/osd/ceph-3/journal
Feb 20 02:39:50 localhost ceph-osd[33177]: load: jerasure load: lrc
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 02:39:50 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 02:39:50 localhost podman[33264]:
Feb 20 02:39:50 localhost podman[33264]: 2026-02-20 07:39:50.661060136 +0000 UTC m=+0.077328480 container create b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc., version=7, build-date=2026-02-09T10:25:24Z, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, vcs-type=git)
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 02:39:50 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 02:39:50 localhost systemd[1]: Started libpod-conmon-b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862.scope.
Feb 20 02:39:50 localhost podman[33264]: 2026-02-20 07:39:50.624725988 +0000 UTC m=+0.040994332 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 02:39:50 localhost systemd[1]: Started libcrun container.
Feb 20 02:39:50 localhost podman[33264]: 2026-02-20 07:39:50.750289718 +0000 UTC m=+0.166558032 container init b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 02:39:50 localhost systemd[1]: tmp-crun.otriDV.mount: Deactivated successfully.
Feb 20 02:39:50 localhost podman[33264]: 2026-02-20 07:39:50.76246215 +0000 UTC m=+0.178730474 container start b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1770267347, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7)
Feb 20 02:39:50 localhost podman[33264]: 2026-02-20 07:39:50.762643106 +0000 UTC m=+0.178911500 container attach b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc.)
Feb 20 02:39:50 localhost systemd[1]: libpod-b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862.scope: Deactivated successfully.
Feb 20 02:39:50 localhost adoring_benz[33283]: 167 167
Feb 20 02:39:50 localhost podman[33264]: 2026-02-20 07:39:50.766749871 +0000 UTC m=+0.183018195 container died b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, release=1770267347)
Feb 20 02:39:50 localhost podman[33288]: 2026-02-20 07:39:50.919075844 +0000 UTC m=+0.137617779 container remove b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.openshift.expose-services=, release=1770267347)
Feb 20 02:39:50 localhost systemd[1]: libpod-conmon-b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862.scope: Deactivated successfully.
Feb 20 02:39:50 localhost ceph-osd[33177]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 20 02:39:50 localhost ceph-osd[33177]: osd.3:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 02:39:50 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 02:39:50 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 02:39:50 localhost ceph-osd[33177]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Feb 20 02:39:50 localhost ceph-osd[33177]: bluefs mount
Feb 20 02:39:50 localhost ceph-osd[33177]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 20 02:39:50 localhost ceph-osd[33177]: bluefs mount shared_bdev_used = 0
Feb 20 02:39:50 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: RocksDB version: 7.9.2
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Git sha 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: DB SUMMARY
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: DB Session ID: O6R872HUA9PK85WK15LU
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: CURRENT file: CURRENT
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: IDENTITY file: IDENTITY
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.error_if_exists: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.create_if_missing: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.paranoid_checks: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.flush_verify_memtable_count: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.env: 0x55cba6c06cb0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.fs: LegacyFileSystem
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.info_log: 0x55cba7910d00
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.max_file_opening_threads: 16
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.statistics: (nil)
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.use_fsync: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.max_log_file_size: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.max_manifest_file_size: 1073741824
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.log_file_time_to_roll: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.keep_log_file_num: 1000
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.recycle_log_file_num: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.allow_fallocate: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.allow_mmap_reads: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.allow_mmap_writes: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.use_direct_reads: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.create_missing_column_families: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.db_log_dir:
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.wal_dir: db.wal
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.table_cache_numshardbits: 6
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.WAL_ttl_seconds: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.WAL_size_limit_MB: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.manifest_preallocation_size: 4194304
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.is_fd_close_on_exec: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.advise_random_on_open: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.db_write_buffer_size: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_manager: 0x55cba695c140
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.access_hint_on_compaction_start: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.random_access_max_buffer_size: 1048576
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.use_adaptive_mutex: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.rate_limiter: (nil)
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.wal_recovery_mode: 2
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.enable_thread_tracking: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.enable_pipelined_write: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.unordered_write: 0
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.allow_concurrent_memtable_write: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.write_thread_max_yield_usec: 100
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.write_thread_slow_yield_usec: 3
Feb 20 02:39:50 localhost ceph-osd[33177]: rocksdb: Options.row_cache: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.wal_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.avoid_flush_during_recovery: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.allow_ingest_behind: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.two_write_queues: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.manual_wal_flush: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.wal_compression: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.atomic_flush: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.persist_stats_to_disk: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_dbid_to_manifest: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.log_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.file_checksum_gen_factory: Unknown
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.best_efforts_recovery: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.allow_data_in_errors: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.db_host_id: __hostname__
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enforce_single_del_contracts: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_background_jobs: 4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_background_compactions: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_subcompactions: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.avoid_flush_during_shutdown: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.writable_file_max_buffer_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.delayed_write_rate : 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_total_wal_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.stats_dump_period_sec: 600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.stats_persist_period_sec: 600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.stats_history_buffer_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_open_files: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bytes_per_sync: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.wal_bytes_per_sync: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.strict_bytes_per_sync: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_readahead_size: 2097152
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_background_flushes: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Compression algorithms supported:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kZSTD supported: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kXpressCompression supported: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kBZip2Compression supported: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kLZ4Compression supported: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kZlibCompression supported: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kLZ4HCCompression supported: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kSnappyCompression supported: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb:
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:51 
localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.compression: LZ4 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 
32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba79110e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost
ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:51 
localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba79110e0)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x55cba694a2d0
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 536870912
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7 Feb 20 02:39:51 localhost
ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false Feb 20 
02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, 
name: O-2) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba79110e0)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x55cba694a2d0
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 536870912
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]:
rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:635] 	(skipping printing options) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:635] 	(skipping printing options) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb:
[db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 06dd77e8-a884-4c07-8af2-d9bd01a9e776 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191000172, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191000358, "job": 1, "event": "recovery_finished"} Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 20 02:39:51 localhost ceph-osd[33177]: 
bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old nid_max 1025
Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old blobid_max 10240
Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta min_alloc_size 0x1000
Feb 20 02:39:51 localhost ceph-osd[33177]: freelist init
Feb 20 02:39:51 localhost ceph-osd[33177]: freelist _read_cfg
Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 20 02:39:51 localhost ceph-osd[33177]: bluefs umount
Feb 20 02:39:51 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 02:39:51 localhost podman[33503]:
Feb 20 02:39:51 localhost podman[33503]: 2026-02-20 07:39:51.12702054 +0000 UTC m=+0.076320448 container create a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-type=git, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.openshift.expose-services=, ceph=True)
Feb 20 02:39:51 localhost podman[33503]: 2026-02-20 07:39:51.084132725 +0000 UTC m=+0.033432643 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 02:39:51 localhost systemd[1]: Started libpod-conmon-a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892.scope.
Feb 20 02:39:51 localhost systemd[1]: Started libcrun container.
Feb 20 02:39:51 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 02:39:51 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 02:39:51 localhost ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 02:39:51 localhost ceph-osd[33177]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Feb 20 02:39:51 localhost ceph-osd[33177]: bluefs mount
Feb 20 02:39:51 localhost ceph-osd[33177]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 20 02:39:51 localhost ceph-osd[33177]: bluefs mount shared_bdev_used = 4718592
Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: RocksDB version: 7.9.2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Git sha 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: DB SUMMARY
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: DB Session ID: O6R872HUA9PK85WK15LV
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: CURRENT file: CURRENT
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: IDENTITY file: IDENTITY
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.error_if_exists: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.create_if_missing: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.flush_verify_memtable_count: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.env: 0x55cba6a98690
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.fs: LegacyFileSystem
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.info_log: 0x55cba79288a0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_file_opening_threads: 16
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.statistics: (nil)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.use_fsync: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_log_file_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_manifest_file_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.log_file_time_to_roll: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.keep_log_file_num: 1000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.recycle_log_file_num: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.allow_fallocate: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.allow_mmap_reads: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.allow_mmap_writes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.use_direct_reads: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.create_missing_column_families: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.db_log_dir:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.wal_dir: db.wal
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_cache_numshardbits: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.WAL_ttl_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.WAL_size_limit_MB: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.manifest_preallocation_size: 4194304
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.is_fd_close_on_exec: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.advise_random_on_open: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.db_write_buffer_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_manager: 0x55cba695d5e0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.access_hint_on_compaction_start: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.random_access_max_buffer_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.use_adaptive_mutex: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.rate_limiter: (nil)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.wal_recovery_mode: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_thread_tracking: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_pipelined_write: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.unordered_write: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.allow_concurrent_memtable_write: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_thread_max_yield_usec: 100
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_thread_slow_yield_usec: 3
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.row_cache: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.wal_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.avoid_flush_during_recovery: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.allow_ingest_behind: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.two_write_queues: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.manual_wal_flush: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.wal_compression: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.atomic_flush: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.persist_stats_to_disk: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_dbid_to_manifest: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.log_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.file_checksum_gen_factory: Unknown
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.best_efforts_recovery: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.allow_data_in_errors: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.db_host_id: __hostname__
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enforce_single_del_contracts: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_background_jobs: 4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_background_compactions: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_subcompactions: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.avoid_flush_during_shutdown: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.writable_file_max_buffer_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.delayed_write_rate : 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_total_wal_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.stats_dump_period_sec: 600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.stats_persist_period_sec: 600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.stats_history_buffer_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_open_files: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bytes_per_sync: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.wal_bytes_per_sync: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.strict_bytes_per_sync: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_readahead_size: 2097152
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_background_flushes: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Compression algorithms supported:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kZSTD supported: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kXpressCompression supported: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kBZip2Compression supported: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kLZ4Compression supported: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kZlibCompression supported: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kLZ4HCCompression supported: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: #011kSnappyCompression supported: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e5a885b05ebb4140d5a3db78aad18887a69bf3e60bc75b8105f640c4d5cdd6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb:
Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:51 
localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.compression: LZ4 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e5a885b05ebb4140d5a3db78aad18887a69bf3e60bc75b8105f640c4d5cdd6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14 
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a2d0#012 block_cache_name: 
BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 
localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None Feb 20 
02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e5a885b05ebb4140d5a3db78aad18887a69bf3e60bc75b8105f640c4d5cdd6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7929920)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7929920)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb:
Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.merge_operator: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_filter_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.sst_partitioner_factory: None Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7929920)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55cba694b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.write_buffer_size: 16777216 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number: 64 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression: LZ4 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression: Disabled Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.num_levels: 7 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.level: 32767 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.enabled: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.arena_block_size: 1048576 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_support: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.bloom_locality: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.max_successive_merges: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.force_consistency_checks: 1 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.ttl: 2592000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: 
Options.periodic_compaction_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_files: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.min_blob_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_size: 268435456 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 20 02:39:51 localhost 
ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 06dd77e8-a884-4c07-8af2-d9bd01a9e776 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191283678, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191297060, "cf_name": "default", "job": 1, "event": "table_file_creation", 
"file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573191, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06dd77e8-a884-4c07-8af2-d9bd01a9e776", "db_session_id": "O6R872HUA9PK85WK15LV", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191309314, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1607, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 466, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, 
"num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573191, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06dd77e8-a884-4c07-8af2-d9bd01a9e776", "db_session_id": "O6R872HUA9PK85WK15LV", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Feb 20 02:39:51 localhost podman[33503]: 2026-02-20 07:39:51.30991227 +0000 UTC m=+0.259212168 container init a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191315169, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573191, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06dd77e8-a884-4c07-8af2-d9bd01a9e776", "db_session_id": "O6R872HUA9PK85WK15LV", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Feb 20 02:39:51 localhost podman[33503]: 2026-02-20 07:39:51.322513596 +0000 UTC m=+0.271813484 container start a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git) Feb 20 02:39:51 localhost podman[33503]: 2026-02-20 07:39:51.325437533 +0000 UTC m=+0.274737421 container attach a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , version=7, release=1770267347, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, ceph=True) Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191327512, "job": 1, "event": "recovery_finished"} Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55cba69b2700 Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: DB pointer 0x55cba786fa00 Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super from 4, latest 4 Feb 20 02:39:51 localhost ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) 
_upgrade_super done Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 02:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012#012Blob file 
count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usag
Feb 20 02:39:51 localhost ceph-osd[33177]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 20 02:39:51 localhost ceph-osd[33177]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 20 02:39:51 localhost ceph-osd[33177]: _get_class not permitted to load lua
Feb 20 02:39:51 localhost ceph-osd[33177]: _get_class not permitted to load sdk
Feb 20 02:39:51 localhost ceph-osd[33177]: _get_class not permitted to load test_remote_reads
Feb 20 02:39:51 localhost ceph-osd[33177]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 20 02:39:51 localhost ceph-osd[33177]: osd.3 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb 20 02:39:51 localhost ceph-osd[33177]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb 20 02:39:51 localhost ceph-osd[33177]: osd.3 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb 20 02:39:51 localhost ceph-osd[33177]: osd.3 0 load_pgs
Feb 20 02:39:51 localhost ceph-osd[33177]: osd.3 0 load_pgs opened 0 pgs
Feb 20 02:39:51 localhost ceph-osd[33177]: osd.3 0 log_to_monitors true
Feb 20 02:39:51 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3[33173]: 2026-02-20T07:39:51.374+0000 7fcb116d6a80 -1 osd.3 0 log_to_monitors true
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 16.388 iops: 4195.397 elapsed_sec: 0.715
Feb 20 02:39:51 localhost ceph-osd[32226]: log_channel(cluster) log [WRN] : OSD bench result of 4195.396697 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 0 waiting for initial osdmap
Feb 20 02:39:51 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0[32221]: 2026-02-20T07:39:51.380+0000 7fd032805640 -1 osd.0 0 waiting for initial osdmap
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 10 crush map has features 288514050185494528, adjusting msgr requires for clients
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 10 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 10 crush map has features 3314932999778484224, adjusting msgr requires for osds
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 10 check_osdmap_features require_osd_release unknown -> reef
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 10 set_numa_affinity not setting numa affinity
Feb 20 02:39:51 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0[32221]: 2026-02-20T07:39:51.395+0000 7fd02de2f640 -1 osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 10 _collect_metadata loop3: no unique device id for loop3: fallback method has no model nor serial
Feb 20 02:39:51 localhost systemd[1]: var-lib-containers-storage-overlay-cf8624e117190d60c41c582dbf41f5a1c10e803c6c47b932fd4c755d9c01a0e2-merged.mount: Deactivated successfully.
Feb 20 02:39:51 localhost ceph-osd[32226]: osd.0 11 state: booting -> active
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: {
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "1635aa65-16b7-4b42-b3ab-efa9a5fbb750": {
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "ceph_fsid": "a8557ee9-b55d-5519-942c-cf8f6172f1d8",
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "osd_id": 3,
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "osd_uuid": "1635aa65-16b7-4b42-b3ab-efa9a5fbb750",
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "type": "bluestore"
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: },
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "246e60bc-5fa8-45c8-b746-372a7c540a58": {
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "ceph_fsid": "a8557ee9-b55d-5519-942c-cf8f6172f1d8",
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "osd_id": 0,
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "osd_uuid": "246e60bc-5fa8-45c8-b746-372a7c540a58",
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: "type": "bluestore"
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: }
Feb 20 02:39:51 localhost vibrant_dubinsky[33519]: }
Feb 20 02:39:51 localhost systemd[1]: libpod-a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892.scope: Deactivated successfully.
Feb 20 02:39:51 localhost podman[33503]: 2026-02-20 07:39:51.922960094 +0000 UTC m=+0.872259982 container died a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, version=7, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, build-date=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Feb 20 02:39:51 localhost systemd[1]: tmp-crun.zo0Sm6.mount: Deactivated successfully.
Feb 20 02:39:52 localhost systemd[1]: var-lib-containers-storage-overlay-59e5a885b05ebb4140d5a3db78aad18887a69bf3e60bc75b8105f640c4d5cdd6-merged.mount: Deactivated successfully.
Feb 20 02:39:52 localhost podman[33773]: 2026-02-20 07:39:52.025144893 +0000 UTC m=+0.091303381 container remove a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.42.2, ceph=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7)
Feb 20 02:39:52 localhost systemd[1]: libpod-conmon-a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892.scope: Deactivated successfully.
Feb 20 02:39:52 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb 20 02:39:52 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb 20 02:39:52 localhost ceph-osd[33177]: osd.3 0 done with init, starting boot process
Feb 20 02:39:52 localhost ceph-osd[33177]: osd.3 0 start_boot
Feb 20 02:39:52 localhost ceph-osd[33177]: osd.3 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 20 02:39:52 localhost ceph-osd[33177]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 20 02:39:52 localhost ceph-osd[33177]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 20 02:39:52 localhost ceph-osd[33177]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 20 02:39:52 localhost ceph-osd[33177]: osd.3 0 bench count 12288000 bsize 4 KiB
Feb 20 02:39:53 localhost ceph-osd[32226]: osd.0 13 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 20 02:39:53 localhost ceph-osd[32226]: osd.0 13 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Feb 20 02:39:53 localhost ceph-osd[32226]: osd.0 13 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 20 02:39:54 localhost podman[33901]: 2026-02-20 07:39:54.036007285 +0000 UTC m=+0.089001776 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, release=1770267347, version=7, CEPH_POINT_RELEASE=)
Feb 20 02:39:54 localhost podman[33901]: 2026-02-20 07:39:54.138439523 +0000 UTC m=+0.191434044 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=)
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 31.913 iops: 8169.697 elapsed_sec: 0.367
Feb 20 02:39:55 localhost ceph-osd[33177]: log_channel(cluster) log [WRN] : OSD bench result of 8169.696602 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.3. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 0 waiting for initial osdmap
Feb 20 02:39:55 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3[33173]: 2026-02-20T07:39:55.141+0000 7fcb0de6a640 -1 osd.3 0 waiting for initial osdmap
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 14 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 14 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 14 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 14 check_osdmap_features require_osd_release unknown -> reef
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 14 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 14 set_numa_affinity not setting numa affinity
Feb 20 02:39:55 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3[33173]: 2026-02-20T07:39:55.159+0000 7fcb08c7f640 -1 osd.3 14 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 14 _collect_metadata loop4: no unique device id for loop4: fallback method has no model nor serial
Feb 20 02:39:55 localhost sshd[34062]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:39:55 localhost ceph-osd[33177]: osd.3 15 state: booting -> active
Feb 20 02:39:56 localhost podman[34100]:
Feb 20 02:39:56 localhost podman[34100]: 2026-02-20 07:39:56.029541606 +0000 UTC m=+0.059467091 container create 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , release=1770267347, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True)
Feb 20 02:39:56 localhost systemd[1]: Started libpod-conmon-29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264.scope.
Feb 20 02:39:56 localhost systemd[1]: Started libcrun container.
Feb 20 02:39:56 localhost podman[34100]: 2026-02-20 07:39:56.101033533 +0000 UTC m=+0.130959018 container init 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vendor=Red Hat, Inc., release=1770267347, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True)
Feb 20 02:39:56 localhost podman[34100]: 2026-02-20 07:39:56.01025445 +0000 UTC m=+0.040179925 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 02:39:56 localhost podman[34100]: 2026-02-20 07:39:56.111157867 +0000 UTC m=+0.141083362 container start 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vcs-type=git, version=7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.42.2, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 02:39:56 localhost wizardly_blackburn[34116]: 167 167
Feb 20 02:39:56 localhost systemd[1]: libpod-29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264.scope: Deactivated successfully.
Feb 20 02:39:56 localhost podman[34100]: 2026-02-20 07:39:56.111416146 +0000 UTC m=+0.141341681 container attach 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, GIT_CLEAN=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347)
Feb 20 02:39:56 localhost podman[34100]: 2026-02-20 07:39:56.118012813 +0000 UTC m=+0.147938298 container died 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 02:39:56 localhost systemd[1]: var-lib-containers-storage-overlay-950dc5cbe605b0669c02a8a224b7d97add694902b56109e9ba2a7af24623ea27-merged.mount: Deactivated successfully.
Feb 20 02:39:56 localhost podman[34121]: 2026-02-20 07:39:56.204460033 +0000 UTC m=+0.074951842 container remove 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 20 02:39:56 localhost systemd[1]: libpod-conmon-29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264.scope: Deactivated successfully.
Feb 20 02:39:56 localhost podman[34143]:
Feb 20 02:39:56 localhost podman[34143]: 2026-02-20 07:39:56.391377147 +0000 UTC m=+0.060435724 container create f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, build-date=2026-02-09T10:25:24Z)
Feb 20 02:39:56 localhost systemd[1]: Started libpod-conmon-f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39.scope.
Feb 20 02:39:56 localhost systemd[1]: Started libcrun container.
Feb 20 02:39:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f17d37de071d78a94b5c31c4f795f633b996934185049a4e499965d957885e9e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f17d37de071d78a94b5c31c4f795f633b996934185049a4e499965d957885e9e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:56 localhost podman[34143]: 2026-02-20 07:39:56.3614511 +0000 UTC m=+0.030509717 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 02:39:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f17d37de071d78a94b5c31c4f795f633b996934185049a4e499965d957885e9e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 02:39:56 localhost podman[34143]: 2026-02-20 07:39:56.477895469 +0000 UTC m=+0.146954096 container init f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, distribution-scope=public, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, version=7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Feb 20 02:39:56 localhost podman[34143]: 2026-02-20 07:39:56.488738887 +0000 UTC m=+0.157797474 container start f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 02:39:56 localhost podman[34143]: 2026-02-20 07:39:56.489110689 +0000 UTC m=+0.158169286 container attach f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2026-02-09T10:25:24Z, release=1770267347, maintainer=Guillaume Abrioux , name=rhceph, com.redhat.component=rhceph-container)
Feb 20 02:39:57 localhost optimistic_black[34159]: [
Feb 20 02:39:57 localhost optimistic_black[34159]: {
Feb 20 02:39:57 localhost optimistic_black[34159]: "available": false,
Feb 20 02:39:57 localhost optimistic_black[34159]: "ceph_device": false,
Feb 20 02:39:57 localhost optimistic_black[34159]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 02:39:57 localhost optimistic_black[34159]: "lsm_data": {},
Feb 20 02:39:57 localhost optimistic_black[34159]: "lvs": [],
Feb 20 02:39:57 localhost optimistic_black[34159]: "path": "/dev/sr0",
Feb 20 02:39:57 localhost optimistic_black[34159]: "rejected_reasons": [
Feb 20 02:39:57 localhost optimistic_black[34159]: "Has a FileSystem",
Feb 20 02:39:57 localhost optimistic_black[34159]: "Insufficient space (<5GB)"
Feb 20 02:39:57 localhost optimistic_black[34159]: ],
Feb 20 02:39:57 localhost optimistic_black[34159]: "sys_api": {
Feb 20 02:39:57 localhost optimistic_black[34159]: "actuators": null,
Feb 20 02:39:57 localhost optimistic_black[34159]: "device_nodes": "sr0",
Feb 20 02:39:57 localhost optimistic_black[34159]: "human_readable_size": "482.00 KB",
Feb 20 02:39:57 localhost optimistic_black[34159]: "id_bus": "ata",
Feb 20 02:39:57 localhost optimistic_black[34159]: "model": "QEMU DVD-ROM",
Feb 20 02:39:57 localhost optimistic_black[34159]: "nr_requests": "2",
Feb 20 02:39:57 localhost optimistic_black[34159]: "partitions": {},
Feb 20 02:39:57 localhost optimistic_black[34159]: "path": "/dev/sr0",
Feb 20 02:39:57 localhost optimistic_black[34159]: "removable": "1",
Feb 20 02:39:57 localhost optimistic_black[34159]: "rev": "2.5+",
Feb 20 02:39:57 localhost optimistic_black[34159]: "ro": "0",
Feb 20 02:39:57 localhost optimistic_black[34159]: "rotational": "1",
Feb 20 02:39:57 localhost optimistic_black[34159]: "sas_address": "",
Feb 20 02:39:57 localhost optimistic_black[34159]: "sas_device_handle": "",
Feb 20 02:39:57 localhost optimistic_black[34159]: "scheduler_mode": "mq-deadline",
Feb 20 02:39:57 localhost optimistic_black[34159]: "sectors": 0,
Feb 20 02:39:57 localhost optimistic_black[34159]: "sectorsize": "2048",
Feb 20 02:39:57 localhost optimistic_black[34159]: "size": 493568.0,
Feb 20 02:39:57 localhost optimistic_black[34159]: "support_discard": "0",
Feb 20 02:39:57 localhost optimistic_black[34159]: "type": "disk",
Feb 20 02:39:57 localhost optimistic_black[34159]: "vendor": "QEMU"
Feb 20 02:39:57 localhost optimistic_black[34159]: }
Feb 20 02:39:57 localhost optimistic_black[34159]: }
Feb 20 02:39:57 localhost optimistic_black[34159]: ]
Feb 20 02:39:57 localhost systemd[1]: libpod-f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39.scope: Deactivated successfully.
Feb 20 02:39:57 localhost podman[34143]: 2026-02-20 07:39:57.313225472 +0000 UTC m=+0.982284089 container died f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.42.2, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, ceph=True, build-date=2026-02-09T10:25:24Z, RELEASE=main, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 02:39:57 localhost systemd[1]: tmp-crun.4cMZNW.mount: Deactivated successfully.
Feb 20 02:39:57 localhost systemd[1]: var-lib-containers-storage-overlay-f17d37de071d78a94b5c31c4f795f633b996934185049a4e499965d957885e9e-merged.mount: Deactivated successfully.
Feb 20 02:39:57 localhost podman[35633]: 2026-02-20 07:39:57.401453971 +0000 UTC m=+0.079817773 container remove f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, release=1770267347, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main) Feb 20 02:39:57 localhost systemd[1]: libpod-conmon-f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39.scope: Deactivated successfully. 
Feb 20 02:39:57 localhost ceph-osd[33177]: osd.3 pg_epoch: 15 pg[1.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=15) [1,5,3] r=2 lpr=15 pi=[13,15)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:40:03 localhost sshd[35662]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:06 localhost podman[35762]: 2026-02-20 07:40:06.35963873 +0000 UTC m=+0.075651105 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, release=1770267347, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., version=7, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 02:40:06 localhost podman[35762]: 2026-02-20 07:40:06.432064718 +0000 UTC m=+0.148077133 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.42.2, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 02:40:08 localhost sshd[35842]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:08 localhost sshd[35843]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:15 localhost sshd[35847]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:18 localhost sshd[35849]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:25 localhost sshd[35851]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:34 localhost sshd[35853]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:40 localhost sshd[35855]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:43 localhost sshd[35857]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:49 localhost sshd[35859]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:40:56 localhost systemd[26592]: Starting Mark boot as 
successful... Feb 20 02:40:56 localhost systemd[26592]: Finished Mark boot as successful. Feb 20 02:40:57 localhost sshd[35862]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:41:05 localhost sshd[35864]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:41:08 localhost podman[35964]: 2026-02-20 07:41:08.262277884 +0000 UTC m=+0.083647745 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, ceph=True, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347) Feb 20 02:41:08 localhost podman[35964]: 2026-02-20 07:41:08.395382946 +0000 UTC m=+0.216752797 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, 
GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc.) Feb 20 02:41:12 localhost sshd[36106]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:41:18 localhost systemd[1]: session-14.scope: Deactivated successfully. Feb 20 02:41:18 localhost systemd[1]: session-14.scope: Consumed 21.687s CPU time. Feb 20 02:41:18 localhost systemd-logind[759]: Session 14 logged out. Waiting for processes to exit. Feb 20 02:41:18 localhost systemd-logind[759]: Removed session 14. 
Feb 20 02:41:23 localhost sshd[36108]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:41:32 localhost sshd[36110]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:41:48 localhost sshd[36112]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:41:54 localhost sshd[36114]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:41:55 localhost sshd[36116]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:41:57 localhost sshd[36118]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:05 localhost sshd[36120]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:11 localhost sshd[36183]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:19 localhost sshd[36200]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:27 localhost sshd[36202]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:35 localhost sshd[36204]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:43 localhost sshd[36206]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:45 localhost sshd[36208]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:49 localhost sshd[36210]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:42:56 localhost sshd[36212]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:02 localhost sshd[36214]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:09 localhost sshd[36216]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:12 localhost sshd[36266]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:20 localhost sshd[36297]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:27 localhost sshd[36299]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:34 localhost sshd[36301]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:35 localhost sshd[36303]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:41 localhost sshd[36305]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:49 localhost sshd[36307]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:43:56 localhost systemd[26592]: Created slice 
User Background Tasks Slice. Feb 20 02:43:56 localhost systemd[26592]: Starting Cleanup of User's Temporary Files and Directories... Feb 20 02:43:56 localhost systemd[26592]: Finished Cleanup of User's Temporary Files and Directories. Feb 20 02:43:58 localhost sshd[36310]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:04 localhost sshd[36312]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:06 localhost sshd[36314]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:13 localhost sshd[36346]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:18 localhost sshd[36395]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:21 localhost sshd[36397]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:26 localhost sshd[36399]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:35 localhost sshd[36401]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:39 localhost sshd[36403]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:40 localhost sshd[36405]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:46 localhost sshd[36407]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:49 localhost sshd[36409]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:49 localhost systemd-logind[759]: New session 28 of user zuul. Feb 20 02:44:49 localhost systemd[1]: Started Session 28 of User zuul. 
Feb 20 02:44:50 localhost python3[36457]: ansible-ansible.legacy.ping Invoked with data=pong Feb 20 02:44:50 localhost sshd[36502]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:44:51 localhost python3[36503]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 02:44:51 localhost python3[36523]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Feb 20 02:44:52 localhost python3[36579]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:44:52 localhost python3[36622]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771573491.7957625-66446-198631752352309/source _original_basename=tmp_6yb4k5j follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:44:52 localhost python3[36653]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:44:53 localhost python3[36669]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:44:53 localhost python3[36685]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:44:54 localhost python3[36701]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None 
insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:44:55 localhost python3[36715]: ansible-ping Invoked with data=pong Feb 20 02:44:58 localhost sshd[36716]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:04 localhost sshd[36718]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:06 localhost sshd[36720]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:06 localhost systemd[1]: Created slice User Slice of UID 1003. Feb 20 02:45:06 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Feb 20 02:45:06 localhost systemd-logind[759]: New session 29 of user tripleo-admin. Feb 20 02:45:06 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Feb 20 02:45:06 localhost systemd[1]: Starting User Manager for UID 1003... Feb 20 02:45:06 localhost systemd[36724]: Queued start job for default target Main User Target. Feb 20 02:45:06 localhost systemd[36724]: Created slice User Application Slice. Feb 20 02:45:06 localhost systemd[36724]: Started Mark boot as successful after the user session has run 2 minutes. Feb 20 02:45:06 localhost systemd[36724]: Started Daily Cleanup of User's Temporary Directories. Feb 20 02:45:06 localhost systemd[36724]: Reached target Paths. Feb 20 02:45:06 localhost systemd[36724]: Reached target Timers. Feb 20 02:45:06 localhost systemd[36724]: Starting D-Bus User Message Bus Socket... Feb 20 02:45:06 localhost systemd[36724]: Starting Create User's Volatile Files and Directories... Feb 20 02:45:06 localhost systemd[36724]: Listening on D-Bus User Message Bus Socket. Feb 20 02:45:06 localhost systemd[36724]: Reached target Sockets. Feb 20 02:45:06 localhost systemd[36724]: Finished Create User's Volatile Files and Directories. Feb 20 02:45:06 localhost systemd[36724]: Reached target Basic System. Feb 20 02:45:06 localhost systemd[36724]: Reached target Main User Target. 
Feb 20 02:45:06 localhost systemd[36724]: Startup finished in 123ms. Feb 20 02:45:06 localhost systemd[1]: Started User Manager for UID 1003. Feb 20 02:45:06 localhost systemd[1]: Started Session 29 of User tripleo-admin. Feb 20 02:45:07 localhost python3[36785]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Feb 20 02:45:08 localhost sshd[36790]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:10 localhost sshd[36792]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:12 localhost python3[36808]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config Feb 20 02:45:12 localhost python3[36825]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None Feb 20 02:45:13 localhost python3[36873]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.tazbhho9tmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:45:13 localhost python3[36903]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.tazbhho9tmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! 
marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:45:15 localhost python3[36949]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.tazbhho9tmphosts insertbefore=BOF block=172.17.0.106 np0005625202.localdomain np0005625202#012172.18.0.106 np0005625202.storage.localdomain np0005625202.storage#012172.20.0.106 np0005625202.storagemgmt.localdomain np0005625202.storagemgmt#012172.17.0.106 np0005625202.internalapi.localdomain np0005625202.internalapi#012172.19.0.106 np0005625202.tenant.localdomain np0005625202.tenant#012192.168.122.106 np0005625202.ctlplane.localdomain np0005625202.ctlplane#012172.17.0.107 np0005625203.localdomain np0005625203#012172.18.0.107 np0005625203.storage.localdomain np0005625203.storage#012172.20.0.107 np0005625203.storagemgmt.localdomain np0005625203.storagemgmt#012172.17.0.107 np0005625203.internalapi.localdomain np0005625203.internalapi#012172.19.0.107 np0005625203.tenant.localdomain np0005625203.tenant#012192.168.122.107 np0005625203.ctlplane.localdomain np0005625203.ctlplane#012172.17.0.108 np0005625204.localdomain np0005625204#012172.18.0.108 np0005625204.storage.localdomain np0005625204.storage#012172.20.0.108 np0005625204.storagemgmt.localdomain np0005625204.storagemgmt#012172.17.0.108 np0005625204.internalapi.localdomain np0005625204.internalapi#012172.19.0.108 np0005625204.tenant.localdomain np0005625204.tenant#012192.168.122.108 np0005625204.ctlplane.localdomain np0005625204.ctlplane#012172.17.0.103 np0005625199.localdomain np0005625199#012172.18.0.103 np0005625199.storage.localdomain np0005625199.storage#012172.20.0.103 np0005625199.storagemgmt.localdomain np0005625199.storagemgmt#012172.17.0.103 np0005625199.internalapi.localdomain np0005625199.internalapi#012172.19.0.103 np0005625199.tenant.localdomain np0005625199.tenant#012192.168.122.103 
np0005625199.ctlplane.localdomain np0005625199.ctlplane#012172.17.0.104 np0005625200.localdomain np0005625200#012172.18.0.104 np0005625200.storage.localdomain np0005625200.storage#012172.20.0.104 np0005625200.storagemgmt.localdomain np0005625200.storagemgmt#012172.17.0.104 np0005625200.internalapi.localdomain np0005625200.internalapi#012172.19.0.104 np0005625200.tenant.localdomain np0005625200.tenant#012192.168.122.104 np0005625200.ctlplane.localdomain np0005625200.ctlplane#012172.17.0.105 np0005625201.localdomain np0005625201#012172.18.0.105 np0005625201.storage.localdomain np0005625201.storage#012172.20.0.105 np0005625201.storagemgmt.localdomain np0005625201.storagemgmt#012172.17.0.105 np0005625201.internalapi.localdomain np0005625201.internalapi#012172.19.0.105 np0005625201.tenant.localdomain np0005625201.tenant#012192.168.122.105 np0005625201.ctlplane.localdomain np0005625201.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.localdomain#012172.18.0.217 overcloud.storage.localdomain#012172.20.0.250 overcloud.storagemgmt.localdomain#012172.17.0.130 overcloud.internalapi.localdomain#012172.21.0.142 overcloud.localdomain#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:45:15 localhost python3[36997]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.tazbhho9tmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:45:15 localhost python3[37014]: ansible-file Invoked with path=/tmp/ansible.tazbhho9tmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:45:16 localhost python3[37045]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:45:17 localhost python3[37062]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 20 02:45:19 localhost sshd[37064]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:22 localhost python3[37083]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:45:22 localhost python3[37100]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False 
cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 20 02:45:25 localhost sshd[37102]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:29 localhost sshd[37173]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:39 localhost sshd[37293]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:46 localhost sshd[37590]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:50 localhost sshd[37609]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:55 localhost sshd[37636]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:45:58 localhost sshd[37657]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:03 localhost sshd[37694]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:08 localhost sshd[37729]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:16 localhost sshd[37768]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:23 localhost sshd[37854]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:29 localhost kernel: SELinux: Converting 2700 SID table entries... 
Feb 20 02:46:29 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 20 02:46:29 localhost kernel: SELinux: policy capability open_perms=1 Feb 20 02:46:29 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 20 02:46:29 localhost kernel: SELinux: policy capability always_check_network=0 Feb 20 02:46:29 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 20 02:46:29 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 20 02:46:29 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 20 02:46:29 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=6 res=1 Feb 20 02:46:30 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 02:46:30 localhost systemd[1]: Starting man-db-cache-update.service... Feb 20 02:46:30 localhost systemd[1]: Reloading. Feb 20 02:46:30 localhost systemd-rc-local-generator[37993]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:46:30 localhost systemd-sysv-generator[37996]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:46:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:46:30 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 20 02:46:30 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 20 02:46:30 localhost systemd[1]: Finished man-db-cache-update.service. Feb 20 02:46:30 localhost systemd[1]: run-ra020798fb2eb49aaaa2a151af8915f20.service: Deactivated successfully. 
Feb 20 02:46:31 localhost sshd[38413]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:33 localhost python3[38430]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:46:35 localhost python3[38569]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:46:35 localhost systemd[1]: Reloading. Feb 20 02:46:35 localhost systemd-rc-local-generator[38595]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:46:35 localhost systemd-sysv-generator[38600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:46:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 02:46:37 localhost python3[38623]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:46:37 localhost python3[38639]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:46:38 localhost python3[38656]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 20 02:46:38 localhost python3[38674]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:46:39 localhost python3[38692]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:46:39 localhost python3[38710]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 02:46:39 localhost systemd[1]: Reloading Network Manager... 
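[Editor's note] The two `ansible-ini_file` calls above set `dns=none` and `rc-manager=unmanaged` in the `[main]` section of `/etc/NetworkManager/NetworkManager.conf` (confirmed by the `dns-mgr: init: dns=none … rc-manager=unmanaged` line after the reload). The resulting section is equivalent to this fragment (any pre-existing keys in `[main]` are preserved by the module):

```ini
[main]
dns=none
rc-manager=unmanaged
```

With these settings NetworkManager stops writing `/etc/resolv.conf`, leaving DNS configuration to the deployment tooling.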
Feb 20 02:46:39 localhost NetworkManager[5988]: [1771573599.7518] audit: op="reload" arg="0" pid=38713 uid=0 result="success" Feb 20 02:46:39 localhost NetworkManager[5988]: [1771573599.7528] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf)) Feb 20 02:46:39 localhost NetworkManager[5988]: [1771573599.7528] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged Feb 20 02:46:39 localhost systemd[1]: Reloaded Network Manager. Feb 20 02:46:40 localhost python3[38729]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:46:40 localhost python3[38746]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:46:40 localhost sshd[38749]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:41 localhost python3[38765]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:46:41 localhost python3[38781]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:46:42 localhost python3[38797]: ansible-tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Feb 20 02:46:42 localhost python3[38813]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:46:43 localhost python3[38830]: ansible-blockinfile Invoked with path=/tmp/ansible.j_w9c8fi block=[192.168.122.106]*,[np0005625202.ctlplane.localdomain]*,[172.17.0.106]*,[np0005625202.internalapi.localdomain]*,[172.18.0.106]*,[np0005625202.storage.localdomain]*,[172.20.0.106]*,[np0005625202.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005625202.tenant.localdomain]*,[np0005625202.localdomain]*,[np0005625202]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDr8sejencX7nSCX6AegGtTuiZL3yclu/L7ZVN4B6dKPdmHqVr33QJD40sEk28GHpx8BrkPU2Qj1de9H6mGtrlwhmJr7Pccg/YqzKoTCQD5rZQ4youU8H70As6YX5ZlXyulwI1SH70XjMm37x4ptKALFOjRnHg0WIXah/tAmzrY/orh+/eCcns7APVjN9B1o+MqP4r47WrWrGU/KxtsHc6dflWxZW7BWUCCNS0e3C4yWLRjy8Hhj7Qkpssv/UBcj+olVHadUUOYiaQZ5Y33MjxwIg8o1MuC7C1dNIn8eXOXXiA8jd/lJd9kImrCGUtkVqj8VQgsMh4vRYMD+0SNLYRDVwxdemOzJYgwQhgiWZ0G+cVhnTBpMmXyIws2OpOKU8R3HjTC3jz+BxvjwEvMDoQfpGgsHB9NCXnkQzs2F8EA8LpA823Ef1SMgPdDCaQzvN5oQPZkWAPMVHvq31xpN9q+KXg/bg0uDaIZXUxW2rGnem7pFS78rRUGL6MfSMn1zs=#012[192.168.122.107]*,[np0005625203.ctlplane.localdomain]*,[172.17.0.107]*,[np0005625203.internalapi.localdomain]*,[172.18.0.107]*,[np0005625203.storage.localdomain]*,[172.20.0.107]*,[np0005625203.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005625203.tenant.localdomain]*,[np0005625203.localdomain]*,[np0005625203]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCtf1NXQ3EGQGdpLLLxuODKBdTGwqsiHL2QZ6zcfpGAa7EhDIxuEcLboqOGjQO0FM3u+kl2gIgKF0UsY5Vjcv4mDCMp7A7srq7TVo5lE5cCppbbXr0/PH2L/naHU3W+W83aT5RE17XPJ0Acn3W51WFBoICCCc4jjWTGmkNEgurKBJmdr0n8NeIcUWZ7Abrs/N2xzNftEFIjAPwebxgEwgCx0hMbdjTFhKbB/V7CjKaCU/UjirWMW5aDQJQEfrCM9u4NHuGaWKzJgar4/shNHaRvkCDbVrRPTCyfNebE04J/R42X3yWmvww4TMZVpRROd/u6Pgg1P2tbPGfQ0XvS0rfY6W4/VnHcyRDqxILH5BoeCAbTuVFmR0hbQu9fNbNxTP+o+na9mHEbNxbhcREnkal8+M0l11YftCRkr4132JITxe7y93gN/dwxE3nJLHLXRuRskWc3GTDT2MVU2Sj64yizD9KOM3oiMBXdPbNbgZywu3hqQvpO00GVg6QRjEJoiFc=#012[192.168.122.108]*,[np0005625204.ctlplane.localdomain]*,[172.17.0.108]*,[np0005625204.internalapi.localdomain]*,[172.18.0.108]*,[np0005625204.storage.localdomain]*,[172.20.0.108]*,[np0005625204.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005625204.tenant.localdomain]*,[np0005625204.localdomain]*,[np0005625204]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAo6exxFtNk/Y5qEGYenJyhnCsS7iZmCGsFaQtJElNSeTTX9a1P0P2EmjtHolRxnljCZ2X8HgWx/irhJvWLoS+dzF5l+KcyQy83+048h51mbnj7zV2uG9i8LkO0egs1uBBp5E+hauHMsuf0nIDFl45W86ZXuf+MfFEKCInhjB5gfE9tTjwmKwKhgO1DE7Vpx3OYy1FHkq0YDBCqQHuuhYPrLZPjfVv3vGOaHH/XCsxX3h8/ixsZbobD56dDBKF/8CFyC/guH8pNUhZHG0dEhz5BT8PcE2Q/M9pPttzmRQksfg9+q7lVy9eCoOVpzqfTgjE1cm5yISwuMZzaNxwjJKB54EWpfl5xxnkC14B+xdvowxpl1PcMNZ0q1fWofJF4TrJAwWCUYZf45aUV2yb5R8WavUT0pX32xmd4zFbXusoafiw2FcgnxoGz3N4ZgIxTPPmgUe13blr1SK44huXWPioaolFBo82xVVFHc+01vfLF3xvs86d6EpqpLH+yaCeUjE=#012[192.168.122.103]*,[np0005625199.ctlplane.localdomain]*,[172.17.0.103]*,[np0005625199.internalapi.localdomain]*,[172.18.0.103]*,[np0005625199.storage.localdomain]*,[172.20.0.103]*,[np0005625199.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005625199.tenant.localdomain]*,[np0005625199.localdomain]*,[np0005625199]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDrnsozeOPJKYg9sx2Tj6QOLRhujK5RVh5RZQ3sb0pk+DbWHQKqS1YvJUg2hV4WxbxPnNUCBtJ+RZ8lVm6RLM+hc3ffe2sOMOz5upO/hTlIpBSfJpQORkiNW+XIXdDVxgE418veFd2hASFmiCmKoFSKXsvnmFU9oTEpja1plcXSqCobFMVYKlhcRo66O0ySlGOR+o3Ar2yNJQjFErEGvZLoDEa/VlA6zreYmTaIsnlUDie0gbm5teTlsCcEYkvWcTzcfOEX2kXQRQbS5qlPtGg7c+KMv5e40rE+2QOigLmOOPVGwNYuLuhb/EHT0C8hK8otW4tiXxBlSZ5ONKY6YYQOpy7krNkWRxNXzK0LfXo2bt2apDaMzebPOvuBj1YyBiLpa6/aLvS/dtGolQNPDpFivPbP/mSpat1qTs0W3/2HyBovwWSGJDW8MMYxbZJ0Z6tnuOwdrPTdkhIibfW9wxgL7EHrDYrGx5CvA2vUM4KDKRntz/cCMGE/zKacSJ48nNk=#012[192.168.122.104]*,[np0005625200.ctlplane.localdomain]*,[172.17.0.104]*,[np0005625200.internalapi.localdomain]*,[172.18.0.104]*,[np0005625200.storage.localdomain]*,[172.20.0.104]*,[np0005625200.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005625200.tenant.localdomain]*,[np0005625200.localdomain]*,[np0005625200]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDW88346W6zU6nxCpqapHtIr5nRG8Jn9LFit3r5klBfauCkmAGONb4X8IwKjo8MD9etebUVbo6aX9gBMBMSs7bSoHzsEQuMLpBDrweSbahQj+gqZ5TmQ/xvwbhws04z3/IJxapAk2xWu7khVGjvOPUE1CROkP+1LiGktQ6Xj1ar1TbLNud2Dq/R5ZalbpK0OT3+no3x0oAJT3W649tW4nmCWcNaxykPsLREsUlH2qVoceAzLEDCSde9/1TONc/URyB4acVqmEwJDHeX51bh31tpQwp/WSe0vKQ6eUw63Tmpn+dRI9xbnFhc6mgGAPcEw7cAUkM7oM6bYMSvVxYDmzMhuXUU/9i3mdMnDBkMyZ5Oed6ZSmFQIJe5k7cz3783d35ZXfl/HsYMqoZ3lmDgbeS59pQrI+BldKyv3sTnoCDahfcmzmiHssxqa7tT5KOuR444q7Nj6wJEIZMEEJEHtMlh1iSBRJZOEOaKjo7h+jV7KMe75aPRasvu9K1v0dqyG6U=#012[192.168.122.105]*,[np0005625201.ctlplane.localdomain]*,[172.17.0.105]*,[np0005625201.internalapi.localdomain]*,[172.18.0.105]*,[np0005625201.storage.localdomain]*,[172.20.0.105]*,[np0005625201.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005625201.tenant.localdomain]*,[np0005625201.localdomain]*,[np0005625201]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCyGkX26ECIsvqnvJegedSF6KicDAAqjaifawEd//OuK9zdHIWqO3XmlEszZqWPsdQhPFkelfzXR+sy3gbPNv+yjT7phsw1sq7zHXeogQFlP5iOQZrf6hCnfXxVk2ckIXMT0UJVZ8FCTwsQi+HKkR/IEj08pR7EjrXGWxHkjv5wNj76spF3FJxtwycS4+KzY3UFy7gYWVn2jB0ha966YgjHMPhzQnT33W9myxGH33M1L5ZCGlfH19hLnqTUNMfzIfw3afxHkL5BFZbhthUPmIfLdLtKmZEkpSTBO/CrNA6CmMfY6xnT78hmwXytEQ+jeiRdKXdr9xQ2j6wVmPzckFKBsBYRe4DprKGt93fnKS9Z6A3Sv626DyZgDa8/NXbtAaBxtyix5Vdt872hYvCzYyB/OuSV6PR5DOq8z3fquOwgtka3rA6qL5gxhFJcO5TqtBM76DzOLd9OLM9bIO1yK9sCmbYynMojkXylzhDfcI8kytS5xs9FJEfwTElZRHkEIQE=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:46:43 localhost sshd[38831]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:43 localhost python3[38848]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.j_w9c8fi' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:46:44 localhost python3[38866]: ansible-file Invoked with path=/tmp/ansible.j_w9c8fi state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:46:45 localhost python3[38882]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None attributes=None Feb 20 02:46:45 localhost python3[38898]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:46:45 localhost python3[38916]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:46:46 localhost python3[38935]: ansible-community.general.cloud_init_data_facts Invoked with filter=status Feb 20 02:46:47 localhost sshd[39057]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:48 localhost python3[39073]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:46:49 localhost python3[39090]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 20 02:46:52 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 20 02:46:52 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. 
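[Editor's note] The known-hosts distribution a few entries above uses `ansible-blockinfile` with `marker=# {mark} ANSIBLE MANAGED BLOCK`, `marker_begin=BEGIN`, `marker_end=END`, writing into a temp file that is then copied over `/etc/ssh/ssh_known_hosts`. A minimal sketch of the marker wrapping blockinfile performs (simplified: it ignores insertion/replacement of an existing block):

```python
def managed_block(block: str, marker: str = "# {mark} ANSIBLE MANAGED BLOCK") -> str:
    """Wrap content in begin/end markers the way ansible-blockinfile does.

    blockinfile substitutes marker_begin/marker_end for {mark}; on later runs
    it locates these markers to replace the block idempotently.
    """
    begin = marker.format(mark="BEGIN")
    end = marker.format(mark="END")
    return f"{begin}\n{block}\n{end}\n"

# Hypothetical single host-key line, abbreviated:
print(managed_block("[192.168.122.106]* ssh-rsa AAAA..."))
```

The markers are what make the operation idempotent: a rerun replaces only the text between `# BEGIN …` and `# END …`, leaving any hand-added entries in the file untouched.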
Feb 20 02:46:52 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 02:46:52 localhost systemd[1]: Starting man-db-cache-update.service... Feb 20 02:46:52 localhost systemd[1]: Reloading. Feb 20 02:46:52 localhost systemd-rc-local-generator[39206]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:46:52 localhost systemd-sysv-generator[39213]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:46:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:46:52 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 20 02:46:52 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Feb 20 02:46:52 localhost systemd[1]: tuned.service: Deactivated successfully. Feb 20 02:46:52 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Feb 20 02:46:52 localhost systemd[1]: tuned.service: Consumed 1.819s CPU time. Feb 20 02:46:52 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Feb 20 02:46:52 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 20 02:46:52 localhost systemd[1]: Finished man-db-cache-update.service. Feb 20 02:46:52 localhost systemd[1]: run-rddb0f232165f4e7caf1030d5981721e4.service: Deactivated successfully. Feb 20 02:46:54 localhost systemd[1]: Started Dynamic System Tuning Daemon. Feb 20 02:46:54 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 02:46:54 localhost systemd[1]: Starting man-db-cache-update.service... Feb 20 02:46:54 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 20 02:46:54 localhost systemd[1]: Finished man-db-cache-update.service. 
Feb 20 02:46:54 localhost systemd[1]: run-radcda4c2443e4fd58d37bb998d45cc0e.service: Deactivated successfully. Feb 20 02:46:55 localhost python3[39529]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:46:55 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Feb 20 02:46:55 localhost systemd[1]: tuned.service: Deactivated successfully. Feb 20 02:46:55 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Feb 20 02:46:55 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Feb 20 02:46:56 localhost sshd[39706]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:46:57 localhost systemd[1]: Started Dynamic System Tuning Daemon. Feb 20 02:46:57 localhost python3[39725]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:46:58 localhost python3[39742]: ansible-slurp Invoked with src=/etc/tuned/active_profile Feb 20 02:46:58 localhost python3[39759]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:46:59 localhost python3[39775]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:47:00 localhost python3[39795]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:47:01 localhost python3[39812]: ansible-stat Invoked with 
path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:47:03 localhost python3[39828]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:06 localhost sshd[39829]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:47:09 localhost python3[39846]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:09 localhost python3[39894]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:10 localhost python3[39939]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573629.5354686-70985-5671078970862/source _original_basename=tmphuc8ixg9 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:10 localhost python3[39969]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:11 localhost python3[40017]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:11 localhost python3[40060]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573631.1330435-71087-215878468385212/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=5387ef5e5a4b3d23a203db65b8a130e906dc0536 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:12 localhost python3[40122]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:12 localhost python3[40165]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573631.9541054-71141-186975358819900/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=b3e2a3c34ad78c32d8298bcfb96fa0bd48de4c29 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:13 localhost python3[40227]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False 
get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:13 localhost python3[40270]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573632.8013954-71141-13360655153265/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=9360c8b01c30dc9677a403a9f11e562b9309fb54 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:14 localhost python3[40332]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:14 localhost python3[40375]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573633.7584717-71141-24362054386575/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=1bd75eeb71ad8a06f7ad5bd2e02e7279e09e867f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:14 localhost sshd[40386]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:47:15 localhost python3[40438]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:15 localhost python3[40481]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573634.7654712-71141-8181946489592/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False 
_original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:15 localhost python3[40544]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:16 localhost python3[40587]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573635.6510413-71141-92921810720287/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=cbfe5bf2a17b805f6637cedd456b7bd33893a9e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:16 localhost python3[40649]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:17 localhost python3[40692]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573636.49638-71141-32355966377171/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:17 localhost python3[40754]: ansible-ansible.legacy.stat Invoked with 
path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:17 localhost python3[40827]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573637.3079348-71141-137780338074535/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=105f529004e67673ca4edd886c338642e88dedf6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:18 localhost python3[40920]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:18 localhost python3[40963]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573638.1331708-71141-212860132408312/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:47:19 localhost python3[41040]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:47:19 localhost python3[41083]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573638.9639196-71141-60813725389109/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False 
_original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:47:20 localhost python3[41145]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:47:20 localhost python3[41188]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573639.808891-71141-33791909834788/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=1a54ea8224417f04a01b19de6c5231a702bdb41b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:47:21 localhost sshd[41203]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:47:21 localhost python3[41219]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 02:47:22 localhost python3[41268]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:47:22 localhost python3[41311]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573641.833584-71995-12021929881542/source _original_basename=tmpcp_tb351 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:47:26 localhost python3[41341]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 02:47:27 localhost python3[41402]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:28 localhost sshd[41404]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:47:32 localhost python3[41421]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:32 localhost sshd[41423]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:47:36 localhost sshd[41425]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:47:37 localhost python3[41441]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:37 localhost python3[41464]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:42 localhost python3[41482]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:42 localhost python3[41505]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:43 localhost sshd[41507]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:47:47 localhost python3[41524]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:48 localhost sshd[41526]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:47:51 localhost python3[41543]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:52 localhost python3[41566]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:47:55 localhost sshd[41568]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:47:56 localhost systemd[36724]: Starting Mark boot as successful...
Feb 20 02:47:56 localhost systemd[36724]: Finished Mark boot as successful.
Feb 20 02:47:56 localhost python3[41586]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:01 localhost python3[41603]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:01 localhost python3[41626]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:05 localhost sshd[41628]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:06 localhost python3[41644]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:10 localhost python3[41662]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:10 localhost python3[41685]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:15 localhost python3[41702]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:18 localhost sshd[41704]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:21 localhost sshd[41783]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:21 localhost python3[41784]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:21 localhost python3[41833]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:21 localhost python3[41851]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpewy50165 recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:22 localhost python3[41896]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:23 localhost python3[41944]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:23 localhost python3[41962]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:23 localhost python3[42024]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:25 localhost python3[42042]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:25 localhost sshd[42043]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:25 localhost sshd[42044]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:26 localhost python3[42107]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:26 localhost python3[42125]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:27 localhost python3[42188]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:27 localhost python3[42206]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:27 localhost python3[42268]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:28 localhost python3[42286]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:28 localhost python3[42348]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:28 localhost python3[42366]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:29 localhost python3[42428]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:29 localhost python3[42446]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:30 localhost python3[42508]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:30 localhost python3[42526]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:30 localhost sshd[42573]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:31 localhost python3[42589]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:31 localhost python3[42607]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:31 localhost python3[42670]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:32 localhost python3[42688]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:32 localhost python3[42750]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:32 localhost python3[42768]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:33 localhost python3[42798]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 02:48:33 localhost python3[42846]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:34 localhost python3[42864]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpxnjlkiz1 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:36 localhost python3[42894]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 02:48:38 localhost sshd[42896]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:41 localhost python3[42913]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 02:48:41 localhost python3[42931]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 02:48:43 localhost python3[42949]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 02:48:43 localhost systemd[1]: Reloading.
Feb 20 02:48:43 localhost systemd-sysv-generator[42981]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 02:48:43 localhost systemd-rc-local-generator[42975]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 02:48:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 02:48:43 localhost systemd[1]: Starting Netfilter Tables...
Feb 20 02:48:43 localhost systemd[1]: Finished Netfilter Tables.
Feb 20 02:48:44 localhost python3[43039]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:44 localhost python3[43082]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573724.157667-74834-258154693612869/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:45 localhost python3[43112]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:45 localhost python3[43130]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:46 localhost python3[43179]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:46 localhost python3[43222]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573725.875214-74947-86404261163006/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:47 localhost python3[43284]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:47 localhost python3[43327]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573726.7924778-75153-33929115001382/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:47 localhost sshd[43363]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:48 localhost python3[43390]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:48 localhost python3[43433]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573727.826416-75217-194572242159236/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:49 localhost python3[43495]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:49 localhost python3[43538]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573728.793991-75274-74104834022578/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:50 localhost python3[43601]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:48:50 localhost python3[43644]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573729.673532-75313-178440309902299/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:51 localhost python3[43674]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:51 localhost python3[43739]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include "/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:48:52 localhost sshd[43743]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:52 localhost python3[43758]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:52 localhost python3[43775]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:48:53 localhost python3[43794]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:48:53 localhost python3[43810]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:48:53 localhost python3[43826]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:48:54 localhost python3[43842]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 20 02:48:55 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=7 res=1
Feb 20 02:48:55 localhost python3[43863]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 20 02:48:55 localhost sshd[43864]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:48:56 localhost kernel: SELinux: Converting 2704 SID table entries...
Feb 20 02:48:56 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 02:48:56 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 02:48:56 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 02:48:56 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 02:48:56 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 02:48:56 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 02:48:56 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 02:48:56 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=8 res=1
Feb 20 02:48:56 localhost python3[43886]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 20 02:48:57 localhost kernel: SELinux: Converting 2704 SID table entries...
Feb 20 02:48:57 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 02:48:57 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 02:48:57 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 02:48:57 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 02:48:57 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 02:48:57 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 02:48:57 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 02:48:57 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=9 res=1
Feb 20 02:48:58 localhost python3[43907]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 20 02:48:58 localhost kernel: SELinux: Converting 2704 SID table entries...
Feb 20 02:48:58 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 02:48:58 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 02:48:58 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 02:48:58 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 02:48:58 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 02:48:58 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 02:48:58 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 02:48:59 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=10 res=1
Feb 20 02:48:59 localhost python3[43928]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:48:59 localhost python3[43944]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:00 localhost python3[43960]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:00 localhost python3[43976]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 02:49:00 localhost python3[43992]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:49:01 localhost python3[44009]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 02:49:01 localhost sshd[44011]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:03 localhost sshd[44013]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:05 localhost python3[44030]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:05 localhost python3[44078]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:06 localhost python3[44121]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573745.5980115-76140-43270010184730/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:06 localhost python3[44151]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 02:49:06 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 20 02:49:06 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 20 02:49:06 localhost systemd[1]: Stopping Load Kernel Modules...
Feb 20 02:49:06 localhost systemd[1]: Starting Load Kernel Modules...
Feb 20 02:49:06 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 20 02:49:06 localhost kernel: Bridge firewalling registered
Feb 20 02:49:06 localhost systemd-modules-load[44154]: Inserted module 'br_netfilter'
Feb 20 02:49:06 localhost systemd-modules-load[44154]: Module 'msr' is built in
Feb 20 02:49:06 localhost systemd[1]: Finished Load Kernel Modules.
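The restart of systemd-modules-load.service above picks up the freshly deployed /etc/modules-load.d/99-tripleo.conf, inserting br_netfilter (needed by the bridge-nf sysctls applied later) and noting that msr is built in. A minimal sketch of how a modules-load.d(5) file is parsed before each name is handed to the module loader — the sample contents are illustrative, since the real 99-tripleo.conf body is not shown in the log:

```python
# Hypothetical parser mirroring modules-load.d(5) semantics:
# one module name per line; blank lines and '#'/';' comment lines are skipped.
def parse_modules_load(text: str) -> list[str]:
    """Return module names from a modules-load.d style config."""
    modules = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", ";")):
            continue
        modules.append(line)
    return modules

# Illustrative file body (assumed, not taken from the deployed conf).
sample = """\
# Loaded at boot by systemd-modules-load.service
br_netfilter
msr
"""
print(parse_modules_load(sample))  # ['br_netfilter', 'msr']
```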
Feb 20 02:49:06 localhost sshd[44158]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:07 localhost python3[44206]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:07 localhost python3[44249]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573747.0509849-76179-9692904317391/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:08 localhost python3[44279]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:08 localhost python3[44296]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:08 localhost python3[44314]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:09 localhost python3[44332]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:09 localhost python3[44350]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:09 localhost sshd[44364]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:09 localhost python3[44369]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:09 localhost python3[44386]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:10 localhost python3[44404]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:10 localhost python3[44422]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:10 localhost python3[44440]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:11 localhost python3[44458]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:11 localhost python3[44476]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:11 localhost python3[44494]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:12 localhost python3[44512]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:12 localhost python3[44529]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:12 localhost python3[44546]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:13 localhost python3[44563]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:13 localhost sshd[44565]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:13 localhost python3[44581]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 02:49:13 localhost python3[44600]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 02:49:13 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 20 02:49:13 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 20 02:49:13 localhost systemd[1]: Stopping Apply Kernel Variables...
Feb 20 02:49:13 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 20 02:49:13 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 20 02:49:13 localhost systemd[1]: Finished Apply Kernel Variables.
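Each ansible-sysctl call above sets one key with state=present and sysctl_set=True against /etc/sysctl.d/99-tripleo.conf, and the run ends by restarting systemd-sysctl.service so Apply Kernel Variables re-reads the file. The key/value pairs the file converges on can be collected from the log; the rendering helper below is a hypothetical sketch, not TripleO's own template:

```python
# Keys and values taken verbatim from the ansible-sysctl invocations in the log.
TRIPLEO_SYSCTLS = {
    "fs.aio-max-nr": "1048576",
    "fs.inotify.max_user_instances": "1024",
    "kernel.pid_max": "1048576",
    "net.bridge.bridge-nf-call-arptables": "1",
    "net.bridge.bridge-nf-call-ip6tables": "1",
    "net.bridge.bridge-nf-call-iptables": "1",
    "net.ipv4.conf.all.rp_filter": "1",
    "net.ipv4.ip_forward": "1",
    "net.ipv4.ip_local_reserved_ports": "35357,49000-49001",
    "net.ipv4.ip_nonlocal_bind": "1",
    "net.ipv4.neigh.default.gc_thresh1": "1024",
    "net.ipv4.neigh.default.gc_thresh2": "2048",
    "net.ipv4.neigh.default.gc_thresh3": "4096",
    "net.ipv6.conf.all.disable_ipv6": "0",
    "net.ipv6.conf.all.forwarding": "0",
    "net.ipv6.conf.default.disable_ipv6": "0",
    "net.ipv6.conf.lo.disable_ipv6": "0",
    "net.ipv6.ip_nonlocal_bind": "1",
}

def render_sysctl_conf(settings: dict[str, str]) -> str:
    """Render one key=value line per setting, in sysctl.conf(5) syntax."""
    return "".join(f"{key}={value}\n" for key, value in settings.items())

conf = render_sysctl_conf(TRIPLEO_SYSCTLS)
print("net.ipv4.ip_forward=1" in conf)  # True
```

Note that the bridge-nf-call-* keys only exist once br_netfilter is loaded, which is why the module load precedes this sysctl pass in the log.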
Feb 20 02:49:14 localhost python3[44620]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:14 localhost python3[44636]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:14 localhost python3[44652]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:15 localhost python3[44668]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 02:49:15 localhost python3[44684]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:15 localhost python3[44700]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:16 localhost python3[44716]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:16 localhost python3[44732]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:16 localhost python3[44748]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:17 localhost python3[44796]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:17 localhost python3[44839]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573757.0013485-76569-164355237576144/source _original_basename=tmpysrkc7db follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:18 localhost python3[44869]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:49:19 localhost python3[44886]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:20 localhost python3[44934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:20 localhost python3[44977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573760.2712302-76911-84892213949368/source _original_basename=tmphskfpoqi follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:21 localhost sshd[45007]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:21 localhost python3[45008]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:21 localhost python3[45024]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:22 localhost python3[45040]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:22 localhost python3[45086]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:22 localhost python3[45122]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:23 localhost python3[45139]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:23 localhost python3[45185]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:23 localhost python3[45220]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:23 localhost python3[45248]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:24 localhost python3[45272]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Feb 20 02:49:24 localhost python3[45301]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 20 02:49:25 localhost python3[45325]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Feb 20 02:49:25 localhost python3[45341]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:49:26 localhost python3[45390]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:26 localhost python3[45433]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573765.7865148-77198-41640684322259/source _original_basename=tmpt9istg74 follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:26 localhost python3[45463]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 20 02:49:27 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=11 res=1
Feb 20 02:49:27 localhost python3[45483]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:28 localhost python3[45499]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:28 localhost python3[45515]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Feb 20 02:49:29 localhost sshd[45535]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:29 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=12 res=1
Feb 20 02:49:30 localhost python3[45537]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 02:49:33 localhost python3[45555]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 02:49:34 localhost python3[45616]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:34 localhost python3[45632]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:49:35 localhost python3[45691]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:35 localhost python3[45734]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573774.6713738-77580-66630746460930/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=0c6402bd7c36c2824760eb4c5e728ced7f603318 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:36 localhost python3[45796]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:36 localhost python3[45841]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573775.7108946-77635-83902544214929/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:36 localhost python3[45871]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:37 localhost python3[45887]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:37 localhost python3[45903]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:37 localhost python3[45919]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:38 localhost sshd[45968]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:38 localhost python3[45967]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:38 localhost python3[46011]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573778.3062062-77751-104110974457188/source _original_basename=tmp41pyhktf follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:39 localhost python3[46041]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:39 localhost python3[46057]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 02:49:40 localhost python3[46074]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 02:49:44 localhost sshd[46108]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:44 localhost python3[46124]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:49:44 localhost python3[46169]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573783.9418566-77982-198255911223850/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:49:45 localhost python3[46201]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 02:49:45 localhost systemd[1]: Stopping OpenSSH server daemon...
Feb 20 02:49:45 localhost systemd[1]: sshd.service: Deactivated successfully.
Feb 20 02:49:45 localhost systemd[1]: sshd.service: Unit process 46108 (sshd) remains running after unit stopped.
Feb 20 02:49:45 localhost systemd[1]: sshd.service: Unit process 46171 (sshd) remains running after unit stopped.
Feb 20 02:49:45 localhost systemd[1]: Stopped OpenSSH server daemon.
Feb 20 02:49:45 localhost systemd[1]: sshd.service: Consumed 13.674s CPU time, read 1.9M from disk, written 1.1M to disk.
Feb 20 02:49:45 localhost systemd[1]: Stopped target sshd-keygen.target.
Feb 20 02:49:45 localhost systemd[1]: Stopping sshd-keygen.target...
Feb 20 02:49:45 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 02:49:45 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 02:49:45 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 02:49:45 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 20 02:49:45 localhost systemd[1]: Starting OpenSSH server daemon...
Feb 20 02:49:45 localhost sshd[46205]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:45 localhost systemd[1]: Started OpenSSH server daemon.
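The sshd_config deployment just before this restart used validate=/usr/sbin/sshd -T -f %s, so the candidate file is syntax-checked at a temporary path and only swapped into place if the check passes; a bad template leaves the running config untouched. A minimal sketch of that validate-then-replace pattern, using a stand-in validator predicate rather than sshd itself (file name and validator below are hypothetical):

```python
import os
import tempfile

def deploy_with_validation(dest: str, content: str, validate) -> bool:
    """Write content to a temp file, run validate(path), then rename over dest."""
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(dest) or ".")
    try:
        with os.fdopen(fd, "w") as handle:
            handle.write(content)
        if not validate(tmp_path):
            return False  # validation failed: existing dest is left untouched
        os.replace(tmp_path, dest)  # atomic rename on POSIX
        tmp_path = None
        return True
    finally:
        if tmp_path is not None:
            os.remove(tmp_path)

# Toy validator standing in for `sshd -T -f <path>`: accept configs that
# set PasswordAuthentication explicitly.
ok = deploy_with_validation(
    "sshd_config.test",
    "PasswordAuthentication no\n",
    lambda path: "PasswordAuthentication" in open(path).read(),
)
print(ok)  # True
```

The "Unit process ... remains running after unit stopped" lines above are the expected side effect of restarting sshd over the very SSH sessions Ansible is using: the per-connection sshd processes outlive the stopped unit.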
Feb 20 02:49:45 localhost python3[46221]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:49:46 localhost python3[46239]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 02:49:46 localhost sshd[46258]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:49:46 localhost python3[46257]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 02:49:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 02:49:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] (multi-line stats entry; journald "#012" newline escapes expanded below)
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 3258 writes, 16K keys, 3258 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
Cumulative WAL: 3258 writes, 145 syncs, 22.47 writes per sync, written: 0.01 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 3258 writes, 16K keys, 3258 commit groups, 1.0 writes per commit group, ingest: 14.65 MB, 0.02 MB/s
Interval WAL: 3258 writes, 145 syncs, 22.47 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s 
write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 
interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Feb 20 02:49:47 localhost sshd[46261]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:49:50 localhost python3[46310]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:49:50 localhost python3[46328]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:49:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 02:49:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3387 writes, 16K keys, 3387 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s#012Cumulative WAL: 3387 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3387 writes, 16K keys, 3387 
commit groups, 1.0 writes per commit group, ingest: 15.26 MB, 0.03 MB/s#012Interval WAL: 3387 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 
seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 
0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Feb 20 02:49:51 localhost python3[46358]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:49:52 localhost python3[46408]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:49:52 localhost python3[46426]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:49:53 
localhost python3[46456]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:49:53 localhost systemd[1]: Reloading. Feb 20 02:49:53 localhost sshd[46458]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:49:53 localhost systemd-sysv-generator[46488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:49:53 localhost systemd-rc-local-generator[46483]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:49:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:49:53 localhost systemd[1]: Starting chronyd online sources service... Feb 20 02:49:53 localhost chronyc[46498]: 200 OK Feb 20 02:49:53 localhost systemd[1]: chrony-online.service: Deactivated successfully. Feb 20 02:49:53 localhost systemd[1]: Finished chronyd online sources service. 
Feb 20 02:49:53 localhost python3[46514]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:49:53 localhost chronyd[26351]: System clock was stepped by -0.000137 seconds Feb 20 02:49:54 localhost python3[46532]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:49:54 localhost python3[46549]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:49:54 localhost chronyd[26351]: System clock was stepped by 0.000000 seconds Feb 20 02:49:54 localhost python3[46566]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:49:55 localhost python3[46583]: ansible-timezone Invoked with name=UTC hwclock=None Feb 20 02:49:55 localhost systemd[1]: Starting Time & Date Service... Feb 20 02:49:55 localhost systemd[1]: Started Time & Date Service. 
Feb 20 02:49:55 localhost sshd[46588]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:49:56 localhost python3[46605]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:49:57 localhost python3[46622]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:49:57 localhost sshd[46624]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:49:57 localhost python3[46640]: ansible-slurp Invoked with src=/etc/tuned/active_profile Feb 20 02:49:58 localhost python3[46656]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:49:58 localhost python3[46672]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 02:49:59 localhost python3[46689]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 02:49:59 localhost python3[46737]: ansible-ansible.legacy.stat 
Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:49:59 localhost python3[46780]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573799.1928504-79045-211679343598931/source _original_basename=tmpsefv5d1g follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:50:00 localhost python3[46842]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:50:00 localhost python3[46885]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573800.085099-79098-107142424176482/source _original_basename=tmpnuzabp4h follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:50:01 localhost python3[46915]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 20 02:50:01 localhost systemd[1]: Reloading. Feb 20 02:50:01 localhost systemd-sysv-generator[46942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:50:01 localhost systemd-rc-local-generator[46938]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:50:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:50:02 localhost python3[46968]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 02:50:02 localhost python3[46984]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:50:02 localhost python3[47001]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:50:02 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully. 
Feb 20 02:50:03 localhost python3[47018]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 02:50:03 localhost python3[47034]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:50:04 localhost python3[47082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:50:04 localhost python3[47125]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573803.7573938-79278-60573188786964/source _original_basename=tmpcwpp_zig follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:50:04 localhost sshd[47140]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:10 localhost sshd[47142]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:16 localhost sshd[47144]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:25 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
Feb 20 02:50:27 localhost sshd[47225]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:28 localhost python3[47242]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 20 02:50:28 localhost python3[47258]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None Feb 20 02:50:28 localhost python3[47274]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 02:50:29 localhost python3[47290]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:50:29 localhost python3[47306]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:50:30 localhost python3[47322]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Feb 20 02:50:30 localhost kernel: SELinux: Converting 2707 SID table entries... Feb 20 02:50:30 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 20 02:50:30 localhost kernel: SELinux: policy capability open_perms=1 Feb 20 02:50:30 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 20 02:50:30 localhost kernel: SELinux: policy capability always_check_network=0 Feb 20 02:50:30 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 20 02:50:30 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 20 02:50:30 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 20 02:50:31 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=13 res=1 Feb 20 02:50:31 localhost python3[47343]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 02:50:32 localhost sshd[47451]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:32 localhost python3[47481]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': 
{'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': 
{'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': [ Feb 20 
02:50:33 localhost rsyslogd[758]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ] Feb 20 02:50:34 localhost python3[47498]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 02:50:34 localhost python3[47514]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 20 02:50:34 localhost python3[47530]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 
'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 
'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': 
'/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n -iNONE', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}} Feb 20 02:50:38 localhost sshd[47531]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:39 localhost python3[47580]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:50:40 localhost python3[47623]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573839.4949474-80839-190790480198677/source _original_basename=tmphya55sug follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER 
validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:50:40 localhost python3[47653]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:50:42 localhost python3[47776]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 02:50:42 localhost sshd[47777]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:44 localhost python3[47899]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 20 02:50:46 localhost sshd[47916]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:46 localhost python3[47915]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:50:47 localhost python3[47933]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False 
nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 20 02:50:51 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 20 02:50:51 localhost dbus-broker-launch[17399]: Noticed file-system modification, trigger reload. Feb 20 02:50:51 localhost dbus-broker-launch[17399]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored Feb 20 02:50:51 localhost dbus-broker-launch[17399]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored Feb 20 02:50:51 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 20 02:50:52 localhost systemd[1]: Reexecuting. Feb 20 02:50:52 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 20 02:50:52 localhost systemd[1]: Detected virtualization kvm. Feb 20 02:50:52 localhost systemd[1]: Detected architecture x86-64. Feb 20 02:50:52 localhost systemd-rc-local-generator[47986]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:50:52 localhost systemd-sysv-generator[47993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:50:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 02:50:55 localhost sshd[48009]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:50:56 localhost systemd[36724]: Created slice User Background Tasks Slice. Feb 20 02:50:56 localhost systemd[36724]: Starting Cleanup of User's Temporary Files and Directories... Feb 20 02:50:56 localhost systemd[36724]: Finished Cleanup of User's Temporary Files and Directories. Feb 20 02:51:00 localhost kernel: SELinux: Converting 2707 SID table entries... Feb 20 02:51:00 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 20 02:51:00 localhost kernel: SELinux: policy capability open_perms=1 Feb 20 02:51:00 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 20 02:51:00 localhost kernel: SELinux: policy capability always_check_network=0 Feb 20 02:51:00 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 20 02:51:00 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 20 02:51:00 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 20 02:51:00 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 20 02:51:00 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=14 res=1 Feb 20 02:51:00 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 20 02:51:01 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 02:51:01 localhost systemd[1]: Starting man-db-cache-update.service... Feb 20 02:51:01 localhost systemd[1]: Reloading. Feb 20 02:51:02 localhost systemd-sysv-generator[48070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:51:02 localhost systemd-rc-local-generator[48066]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 02:51:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:51:02 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 02:51:02 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 20 02:51:02 localhost systemd-journald[618]: Journal stopped Feb 20 02:51:02 localhost systemd-journald[618]: Received SIGTERM from PID 1 (systemd). Feb 20 02:51:02 localhost systemd[1]: Stopping Journal Service... Feb 20 02:51:02 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files... Feb 20 02:51:02 localhost systemd[1]: systemd-journald.service: Deactivated successfully. Feb 20 02:51:02 localhost systemd[1]: Stopped Journal Service. Feb 20 02:51:02 localhost systemd[1]: systemd-journald.service: Consumed 2.438s CPU time. Feb 20 02:51:02 localhost systemd[1]: Starting Journal Service... Feb 20 02:51:02 localhost systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 20 02:51:02 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files. Feb 20 02:51:02 localhost systemd[1]: systemd-udevd.service: Consumed 2.953s CPU time. Feb 20 02:51:02 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files... Feb 20 02:51:02 localhost systemd-journald[48359]: Journal started Feb 20 02:51:02 localhost systemd-journald[48359]: Runtime Journal (/run/log/journal/01f46965e72fd8a157841feaa66c8d52) is 12.7M, max 314.7M, 302.0M free. Feb 20 02:51:02 localhost systemd[1]: Started Journal Service. Feb 20 02:51:02 localhost systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. 
Feb 20 02:51:02 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 20 02:51:02 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 02:51:02 localhost systemd-udevd[48360]: Using default interface naming scheme 'rhel-9.0'. Feb 20 02:51:02 localhost systemd[1]: Started Rule-based Manager for Device Events and Files. Feb 20 02:51:02 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 02:51:02 localhost systemd[1]: Reloading. Feb 20 02:51:02 localhost systemd-sysv-generator[49122]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:51:02 localhost systemd-rc-local-generator[49119]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:51:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:51:02 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 20 02:51:03 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 20 02:51:03 localhost systemd[1]: Finished man-db-cache-update.service. Feb 20 02:51:03 localhost systemd[1]: run-rc0cdca9920d74441b0a37776021354c5.service: Deactivated successfully. Feb 20 02:51:03 localhost systemd[1]: run-rc9b08c4be8c0410ea931fd2b756f1180.service: Deactivated successfully. 
Feb 20 02:51:05 localhost sshd[49414]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:51:07 localhost python3[49433]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False Feb 20 02:51:07 localhost python3[49452]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 02:51:08 localhost python3[49470]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:51:08 localhost python3[49470]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json Feb 20 02:51:08 localhost python3[49470]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false Feb 20 02:51:16 localhost podman[49484]: 2026-02-20 07:51:08.559441929 +0000 UTC m=+0.039625571 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 20 02:51:16 localhost python3[49470]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 591bb9fb46a70e9f840f28502388406078442df6b6701a3c17990ee75e333673 --format json Feb 20 02:51:16 localhost sshd[49573]: main: sshd: 
ssh-rsa algorithm is disabled Feb 20 02:51:19 localhost python3[49591]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:51:19 localhost python3[49591]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json Feb 20 02:51:19 localhost python3[49591]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false Feb 20 02:51:23 localhost sshd[49641]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:51:25 localhost sshd[49656]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:51:28 localhost podman[49603]: 2026-02-20 07:51:20.029869189 +0000 UTC m=+0.023600713 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 20 02:51:28 localhost python3[49591]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d59b33e7fb841c47a47a12b18fb68b11debd968b4596c63f3177ecc7400fb1bc --format json Feb 20 02:51:28 localhost sshd[49743]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:51:28 localhost podman[49809]: 2026-02-20 07:51:28.986256823 +0000 UTC m=+0.078347269 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1770267347, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 02:51:29 localhost podman[49809]: 2026-02-20 07:51:29.056455279 +0000 UTC m=+0.148545685 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, 
name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 02:51:29 localhost python3[49806]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:51:29 localhost python3[49806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json Feb 20 02:51:29 localhost python3[49806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false Feb 20 02:51:32 localhost sshd[49988]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:51:48 localhost podman[49888]: 2026-02-20 07:51:29.369983526 +0000 UTC m=+0.035851213 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 02:51:48 localhost python3[49806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 
6eddd23e1e6adfbfa713a747123707c02f92ffdbf1913da92f171aba1d6d7856 --format json Feb 20 02:51:48 localhost sshd[50333]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:51:48 localhost python3[50332]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:51:48 localhost python3[50332]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json Feb 20 02:51:48 localhost python3[50332]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false Feb 20 02:52:00 localhost sshd[50397]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:01 localhost podman[50346]: 2026-02-20 07:51:48.535033434 +0000 UTC m=+0.028081231 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 02:52:01 localhost python3[50332]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 2c8610235afe953aa46efb141a5a988799548b22280d65a7e7ab21889422df37 --format json Feb 20 02:52:01 localhost python3[50431]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': 
None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:52:01 localhost python3[50431]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json Feb 20 02:52:01 localhost python3[50431]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false Feb 20 02:52:08 localhost sshd[50737]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:10 localhost podman[50444]: 2026-02-20 07:52:01.771158909 +0000 UTC m=+0.046988597 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 20 02:52:10 localhost python3[50431]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ab5aab6d0c3ec80926032b7acf4cec1d4710f1c2daccd17ae4daa64399ec237 --format json Feb 20 02:52:11 localhost python3[50792]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:52:11 localhost python3[50792]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json Feb 20 02:52:11 localhost python3[50792]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q 
--tls-verify=false Feb 20 02:52:11 localhost sshd[50817]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:14 localhost sshd[50856]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:15 localhost podman[50804]: 2026-02-20 07:52:11.235088033 +0000 UTC m=+0.045336466 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 20 02:52:15 localhost python3[50792]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 4853142d85dba3766b28d28ae195b26f7242230fe3646e9590a7aee2dc2e0dfa --format json Feb 20 02:52:15 localhost python3[50886]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:52:15 localhost python3[50886]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json Feb 20 02:52:15 localhost python3[50886]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false Feb 20 02:52:17 localhost podman[50899]: 2026-02-20 07:52:15.73509789 +0000 UTC m=+0.042531429 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 20 02:52:17 localhost python3[50886]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ac6ea63c0fb4851145e847f9ced2f20804afc8472907b63a82d5866f5cf608a --format json Feb 20 02:52:18 localhost python3[50976]: ansible-containers.podman.podman_image Invoked with force=True 
name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:52:18 localhost python3[50976]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json Feb 20 02:52:18 localhost python3[50976]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false Feb 20 02:52:18 localhost sshd[51002]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:20 localhost podman[50989]: 2026-02-20 07:52:18.229254902 +0000 UTC m=+0.043812659 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 20 02:52:20 localhost python3[50976]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect ba1a08ea1c1207b471b1f02cee16ff456b8a812662cce16906d16de330a66d63 --format json Feb 20 02:52:20 localhost python3[51069]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:52:20 localhost python3[51069]: ansible-containers.podman.podman_image 
PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json Feb 20 02:52:20 localhost python3[51069]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false Feb 20 02:52:22 localhost podman[51082]: 2026-02-20 07:52:20.700589607 +0000 UTC m=+0.034146876 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 20 02:52:22 localhost python3[51069]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 8576d3a17e57ea28f29435f132f583320941b5aa7bf0aa02e998b09a094d1fe8 --format json Feb 20 02:52:22 localhost python3[51160]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:52:22 localhost python3[51160]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json Feb 20 02:52:23 localhost python3[51160]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false Feb 20 02:52:25 localhost sshd[51210]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:26 localhost podman[51172]: 2026-02-20 07:52:23.070355153 +0000 UTC m=+0.034608610 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 20 02:52:26 localhost python3[51160]: 
ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 7fcbf63c0504494c8fcaa07583f909a06486472a0982aeac9554c6fdbeb04c9a --format json Feb 20 02:52:27 localhost python3[51262]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 20 02:52:27 localhost python3[51262]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json Feb 20 02:52:27 localhost python3[51262]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false Feb 20 02:52:28 localhost podman[51275]: 2026-02-20 07:52:27.180270735 +0000 UTC m=+0.041764263 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 20 02:52:28 localhost python3[51262]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 72ddf109f135b64d3116af7b84caaa358dc72e2e60f4c8753fa54fa65b76ba35 --format json Feb 20 02:52:29 localhost python3[51352]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:52:31 localhost ansible-async_wrapper.py[51524]: Invoked with 700999029294 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573950.6461928-83846-246300545821431/AnsiballZ_command.py _ Feb 20 02:52:31 localhost ansible-async_wrapper.py[51527]: Starting module and watcher Feb 20 
02:52:31 localhost ansible-async_wrapper.py[51527]: Start watching 51528 (3600) Feb 20 02:52:31 localhost ansible-async_wrapper.py[51528]: Start module (51528) Feb 20 02:52:31 localhost ansible-async_wrapper.py[51524]: Return async_wrapper task started. Feb 20 02:52:31 localhost python3[51545]: ansible-ansible.legacy.async_status Invoked with jid=700999029294.51524 mode=status _async_dir=/tmp/.ansible_async Feb 20 02:52:35 localhost puppet-user[51548]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 02:52:35 localhost puppet-user[51548]: (file: /etc/puppet/hiera.yaml) Feb 20 02:52:35 localhost puppet-user[51548]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:52:35 localhost puppet-user[51548]: (file & line not available) Feb 20 02:52:35 localhost puppet-user[51548]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:52:35 localhost puppet-user[51548]: (file & line not available) Feb 20 02:52:35 localhost puppet-user[51548]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Feb 20 02:52:35 localhost puppet-user[51548]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 20 02:52:35 localhost puppet-user[51548]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.16 seconds Feb 20 02:52:35 localhost puppet-user[51548]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully Feb 20 02:52:35 localhost puppet-user[51548]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created Feb 20 02:52:35 localhost puppet-user[51548]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully Feb 20 02:52:35 localhost puppet-user[51548]: Notice: Applied catalog in 0.05 seconds Feb 20 02:52:35 localhost puppet-user[51548]: Application: Feb 20 02:52:35 localhost puppet-user[51548]: Initial environment: production Feb 20 02:52:35 localhost puppet-user[51548]: Converged environment: production Feb 20 02:52:35 localhost puppet-user[51548]: Run mode: user Feb 20 02:52:35 localhost puppet-user[51548]: Changes: Feb 20 02:52:35 localhost puppet-user[51548]: Total: 3 Feb 20 02:52:35 localhost puppet-user[51548]: Events: Feb 20 02:52:35 localhost puppet-user[51548]: Success: 3 Feb 20 02:52:35 localhost puppet-user[51548]: Total: 3 Feb 20 02:52:35 localhost puppet-user[51548]: Resources: Feb 20 02:52:35 localhost puppet-user[51548]: Changed: 3 Feb 20 02:52:35 localhost puppet-user[51548]: Out of sync: 3 Feb 20 02:52:35 localhost puppet-user[51548]: Total: 10 Feb 20 02:52:35 localhost puppet-user[51548]: Time: Feb 20 02:52:35 localhost puppet-user[51548]: Schedule: 0.00 Feb 20 02:52:35 localhost puppet-user[51548]: File: 0.00 Feb 20 02:52:35 localhost puppet-user[51548]: Exec: 0.02 Feb 20 02:52:35 localhost puppet-user[51548]: Augeas: 0.02 Feb 20 02:52:35 localhost puppet-user[51548]: Transaction evaluation: 0.05 Feb 20 02:52:35 
localhost puppet-user[51548]: Catalog application: 0.05 Feb 20 02:52:35 localhost puppet-user[51548]: Config retrieval: 0.20 Feb 20 02:52:35 localhost puppet-user[51548]: Last run: 1771573955 Feb 20 02:52:35 localhost puppet-user[51548]: Filebucket: 0.00 Feb 20 02:52:35 localhost puppet-user[51548]: Total: 0.06 Feb 20 02:52:35 localhost puppet-user[51548]: Version: Feb 20 02:52:35 localhost puppet-user[51548]: Config: 1771573955 Feb 20 02:52:35 localhost puppet-user[51548]: Puppet: 7.10.0 Feb 20 02:52:35 localhost ansible-async_wrapper.py[51528]: Module complete (51528) Feb 20 02:52:35 localhost sshd[51660]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:36 localhost ansible-async_wrapper.py[51527]: Done in kid B. Feb 20 02:52:42 localhost python3[51692]: ansible-ansible.legacy.async_status Invoked with jid=700999029294.51524 mode=status _async_dir=/tmp/.ansible_async Feb 20 02:52:47 localhost sshd[51739]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:51 localhost python3[51771]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 02:52:51 localhost python3[51788]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:52:52 localhost python3[51836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:52:52 localhost python3[51879]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573972.1343498-84213-47671038942375/source _original_basename=tmp0uv_bnxo follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 02:52:55 localhost python3[51909]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:52:56 localhost python3[52012]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 20 02:52:56 localhost python3[52031]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None 
attributes=None Feb 20 02:52:57 localhost python3[52047]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005625204 step=1 update_config_hash_only=False Feb 20 02:52:57 localhost python3[52195]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:52:58 localhost python3[52211]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 20 02:52:58 localhost sshd[52212]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:52:58 localhost python3[52228]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Feb 20 02:52:59 localhost python3[52271]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False Feb 20 02:53:00 localhost podman[52439]: 2026-02-20 07:53:00.111720753 +0000 UTC m=+0.102959229 container create 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 
'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_puppet_step1, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5) Feb 20 02:53:00 localhost podman[52450]: 2026-02-20 07:53:00.130562541 +0000 UTC m=+0.110981029 container create aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, container_name=container-puppet-collectd, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_puppet_step1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 
17.1 collectd, tcib_managed=true) Feb 20 02:53:00 localhost podman[52439]: 2026-02-20 07:53:00.045427992 +0000 UTC m=+0.036666488 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 02:53:00 localhost podman[52450]: 2026-02-20 07:53:00.055300223 +0000 UTC m=+0.035718721 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 20 02:53:00 localhost podman[52451]: 2026-02-20 07:53:00.166229259 +0000 UTC m=+0.144557997 container create 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:34:43Z, tcib_managed=true, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 
'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64) Feb 20 02:53:00 localhost systemd[1]: Started libpod-conmon-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed.scope. Feb 20 02:53:00 localhost systemd[1]: Started libpod-conmon-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508.scope. Feb 20 02:53:00 localhost systemd[1]: Started libcrun container. Feb 20 02:53:00 localhost systemd[1]: Started libcrun container. 
Feb 20 02:53:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f138d9d6c461962e8cf2ee8539c9294af2f13aab0c8b266d53219a78c733e21/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e6d071d08fea63259fe30a26bb9b27228bc0b7a6111c0f215f4e35846a4b7e3/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:00 localhost podman[52443]: 2026-02-20 07:53:00.100357039 +0000 UTC m=+0.083926507 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 20 02:53:00 localhost systemd[1]: Started libpod-conmon-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05.scope. Feb 20 02:53:00 localhost podman[52451]: 2026-02-20 07:53:00.10075017 +0000 UTC m=+0.079078898 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 20 02:53:00 localhost podman[52470]: 2026-02-20 07:53:00.102606803 +0000 UTC m=+0.068439034 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 20 02:53:00 localhost podman[52439]: 2026-02-20 07:53:00.203601366 +0000 UTC m=+0.194839842 container init 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# 
TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-nova_libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-libvirt, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z) Feb 20 02:53:00 localhost systemd[1]: Started libcrun container. Feb 20 02:53:00 localhost podman[52439]: 2026-02-20 07:53:00.21146629 +0000 UTC m=+0.202704776 container start 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, container_name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 20 02:53:00 localhost podman[52439]: 2026-02-20 07:53:00.211785279 +0000 UTC m=+0.203023775 container attach 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include 
::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=container-puppet-nova_libvirt) Feb 20 02:53:00 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:00 localhost podman[52451]: 2026-02-20 07:53:00.22161548 +0000 UTC m=+0.199944188 container init 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=container-puppet-iscsid, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 
'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-iscsid-container) Feb 20 02:53:00 localhost podman[52443]: 2026-02-20 07:53:00.231345578 +0000 UTC m=+0.214915026 container create 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': 
['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public) Feb 20 02:53:00 localhost systemd[1]: Started libpod-conmon-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab.scope. Feb 20 02:53:00 localhost systemd[1]: Started libcrun container. Feb 20 02:53:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b4a27664720ce930aee8034c0e3a2e981bce86564061fc7e3c5cc60116ab629/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:00 localhost podman[52450]: 2026-02-20 07:53:00.346462363 +0000 UTC m=+0.326880861 container init aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 
'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, url=https://www.redhat.com, container_name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Feb 20 02:53:00 localhost podman[52450]: 2026-02-20 07:53:00.352800374 +0000 UTC m=+0.333218862 container start aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 20 02:53:00 localhost podman[52450]: 2026-02-20 07:53:00.353002689 +0000 UTC m=+0.333421197 container attach aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, distribution-scope=public, container_name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 02:53:00 localhost podman[52443]: 2026-02-20 07:53:00.377589911 +0000 UTC m=+0.361159339 container init 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, tcib_managed=true, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-crond, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 02:53:00 localhost podman[52470]: 2026-02-20 07:53:00.384037936 +0000 UTC m=+0.349870117 container create 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=container-puppet-metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true) Feb 20 02:53:00 localhost systemd[1]: Started libpod-conmon-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2.scope. Feb 20 02:53:00 localhost systemd[1]: Started libcrun container. 
Feb 20 02:53:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17d67d7c6c3046ba2041c4048263641e426665d92e1e8fa18e3c871ca9222f66/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:00 localhost podman[52451]: 2026-02-20 07:53:00.437173853 +0000 UTC m=+0.415502551 container start 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 02:53:00 localhost podman[52451]: 2026-02-20 07:53:00.437416359 +0000 UTC m=+0.415745167 container attach 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_puppet_step1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
build-date=2026-01-12T22:34:43Z) Feb 20 02:53:00 localhost podman[52443]: 2026-02-20 07:53:00.440385064 +0000 UTC m=+0.423954492 container start 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-crond, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 20 02:53:00 localhost podman[52443]: 2026-02-20 07:53:00.440627261 +0000 UTC m=+0.424196709 container attach 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=container-puppet-crond, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, tcib_managed=true, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 02:53:00 localhost podman[52470]: 2026-02-20 07:53:00.443558505 +0000 UTC m=+0.409390706 container init 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
container_name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, release=1766032510) Feb 20 02:53:00 localhost podman[52470]: 2026-02-20 07:53:00.458697816 +0000 UTC m=+0.424529997 container start 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, container_name=container-puppet-metrics_qdr, build-date=2026-01-12T22:10:14Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': 
{'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 02:53:00 localhost podman[52470]: 2026-02-20 07:53:00.459160259 +0000 UTC m=+0.424992480 container attach 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, architecture=x86_64) Feb 20 02:53:00 localhost sshd[52602]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:53:02 localhost puppet-user[52556]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 02:53:02 localhost puppet-user[52556]: (file: /etc/puppet/hiera.yaml) Feb 20 02:53:02 localhost puppet-user[52556]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:53:02 localhost puppet-user[52556]: (file & line not available) Feb 20 02:53:02 localhost puppet-user[52556]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:53:02 localhost puppet-user[52556]: (file & line not available) Feb 20 02:53:02 localhost puppet-user[52556]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.11 seconds Feb 20 02:53:04 localhost puppet-user[52556]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully Feb 20 02:53:04 localhost puppet-user[52556]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created Feb 20 02:53:04 localhost puppet-user[52556]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully Feb 20 02:53:05 localhost ovs-vsctl[52713]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Feb 20 02:53:05 localhost puppet-user[52556]: Notice: 
/Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully Feb 20 02:53:05 localhost puppet-user[52556]: Notice: Applied catalog in 2.69 seconds Feb 20 02:53:05 localhost puppet-user[52556]: Application: Feb 20 02:53:05 localhost puppet-user[52556]: Initial environment: production Feb 20 02:53:05 localhost puppet-user[52556]: Converged environment: production Feb 20 02:53:05 localhost puppet-user[52556]: Run mode: user Feb 20 02:53:05 localhost puppet-user[52556]: Changes: Feb 20 02:53:05 localhost puppet-user[52556]: Total: 4 Feb 20 02:53:05 localhost puppet-user[52556]: Events: Feb 20 02:53:05 localhost puppet-user[52556]: Success: 4 Feb 20 02:53:05 localhost puppet-user[52556]: Total: 4 Feb 20 02:53:05 localhost puppet-user[52556]: Resources: Feb 20 02:53:05 localhost puppet-user[52556]: Changed: 4 Feb 20 02:53:05 localhost puppet-user[52556]: Out of sync: 4 Feb 20 02:53:05 localhost puppet-user[52556]: Skipped: 8 Feb 20 02:53:05 localhost puppet-user[52556]: Total: 13 Feb 20 02:53:05 localhost puppet-user[52556]: Time: Feb 20 02:53:05 localhost puppet-user[52556]: File: 0.00 Feb 20 02:53:05 localhost puppet-user[52556]: Config retrieval: 0.15 Feb 20 02:53:05 localhost puppet-user[52556]: Augeas: 0.79 Feb 20 02:53:05 localhost puppet-user[52556]: Exec: 1.88 Feb 20 02:53:05 localhost puppet-user[52556]: Last run: 1771573985 Feb 20 02:53:05 localhost puppet-user[52556]: Transaction evaluation: 2.68 Feb 20 02:53:05 localhost puppet-user[52556]: Catalog application: 2.69 Feb 20 02:53:05 localhost puppet-user[52556]: Total: 2.69 Feb 20 02:53:05 localhost puppet-user[52556]: Version: Feb 20 02:53:05 localhost puppet-user[52556]: Config: 1771573982 Feb 20 02:53:05 localhost puppet-user[52556]: Puppet: 7.10.0 Feb 20 02:53:05 localhost puppet-user[52569]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Feb 20 02:53:05 localhost puppet-user[52569]: (file: /etc/puppet/hiera.yaml) Feb 20 02:53:05 localhost puppet-user[52569]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:53:05 localhost puppet-user[52569]: (file & line not available) Feb 20 02:53:05 localhost puppet-user[52584]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 02:53:05 localhost puppet-user[52584]: (file: /etc/puppet/hiera.yaml) Feb 20 02:53:05 localhost puppet-user[52584]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:53:05 localhost puppet-user[52584]: (file & line not available) Feb 20 02:53:05 localhost puppet-user[52553]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 02:53:05 localhost puppet-user[52553]: (file: /etc/puppet/hiera.yaml) Feb 20 02:53:05 localhost puppet-user[52553]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:53:05 localhost puppet-user[52553]: (file & line not available) Feb 20 02:53:05 localhost puppet-user[52569]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:53:05 localhost puppet-user[52569]: (file & line not available) Feb 20 02:53:05 localhost puppet-user[52584]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:53:05 localhost puppet-user[52584]: (file & line not available) Feb 20 02:53:05 localhost puppet-user[52553]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:53:05 localhost puppet-user[52553]: (file & line not available) Feb 20 02:53:05 localhost puppet-user[52584]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.07 seconds Feb 20 02:53:05 localhost puppet-user[52601]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 02:53:05 localhost puppet-user[52601]: (file: /etc/puppet/hiera.yaml) Feb 20 02:53:05 localhost puppet-user[52584]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0' Feb 20 02:53:05 localhost puppet-user[52601]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:53:05 localhost puppet-user[52601]: (file & line not available) Feb 20 02:53:05 localhost puppet-user[52584]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created Feb 20 02:53:05 localhost puppet-user[52584]: Notice: Applied catalog in 0.11 seconds Feb 20 02:53:05 localhost puppet-user[52584]: Application: Feb 20 02:53:05 localhost puppet-user[52584]: Initial environment: production Feb 20 02:53:05 localhost puppet-user[52584]: Converged environment: production Feb 20 02:53:05 localhost puppet-user[52584]: Run mode: user Feb 20 02:53:05 localhost puppet-user[52584]: Changes: Feb 20 02:53:05 localhost puppet-user[52584]: Total: 2 Feb 20 02:53:05 localhost puppet-user[52584]: Events: Feb 20 02:53:05 localhost puppet-user[52584]: Success: 2 Feb 20 02:53:05 localhost puppet-user[52584]: Total: 2 Feb 20 02:53:05 localhost puppet-user[52584]: Resources: Feb 20 02:53:05 localhost puppet-user[52584]: Changed: 2 Feb 20 02:53:05 localhost puppet-user[52584]: Out of sync: 2 Feb 20 02:53:05 localhost puppet-user[52584]: Skipped: 7 Feb 20 02:53:05 localhost puppet-user[52584]: 
Total: 9 Feb 20 02:53:05 localhost puppet-user[52584]: Time: Feb 20 02:53:05 localhost puppet-user[52584]: Cron: 0.01 Feb 20 02:53:05 localhost puppet-user[52584]: File: 0.08 Feb 20 02:53:05 localhost puppet-user[52584]: Config retrieval: 0.10 Feb 20 02:53:05 localhost puppet-user[52584]: Transaction evaluation: 0.10 Feb 20 02:53:05 localhost puppet-user[52584]: Catalog application: 0.11 Feb 20 02:53:05 localhost puppet-user[52584]: Last run: 1771573985 Feb 20 02:53:05 localhost puppet-user[52584]: Total: 0.11 Feb 20 02:53:05 localhost puppet-user[52584]: Version: Feb 20 02:53:05 localhost puppet-user[52584]: Config: 1771573985 Feb 20 02:53:05 localhost puppet-user[52584]: Puppet: 7.10.0 Feb 20 02:53:05 localhost puppet-user[52601]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:53:05 localhost puppet-user[52601]: (file & line not available) Feb 20 02:53:05 localhost puppet-user[52553]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \ Feb 20 02:53:05 localhost puppet-user[52553]: in a future release. Use nova::cinder::os_region_name instead Feb 20 02:53:05 localhost puppet-user[52553]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \ Feb 20 02:53:05 localhost puppet-user[52553]: in a future release. 
Use nova::cinder::catalog_info instead Feb 20 02:53:05 localhost puppet-user[52601]: Notice: Accepting previously invalid value for target type 'Integer' Feb 20 02:53:05 localhost puppet-user[52601]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.27 seconds Feb 20 02:53:05 localhost puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Feb 20 02:53:05 localhost puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Feb 20 02:53:05 localhost puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Feb 20 02:53:05 localhost puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Feb 20 02:53:05 localhost puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}e814c37697a48f1c054f6d0fe463bdd460287439d0bb244a932a4a914847eb0b' Feb 20 02:53:05 localhost puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Feb 20 02:53:05 localhost puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Feb 20 02:53:06 localhost puppet-user[52601]: Notice: Applied catalog in 0.03 seconds Feb 20 02:53:06 localhost puppet-user[52601]: Application: Feb 20 02:53:06 localhost puppet-user[52601]: Initial environment: production Feb 20 02:53:06 localhost puppet-user[52601]: Converged environment: production Feb 20 02:53:06 localhost puppet-user[52601]: Run mode: user Feb 20 02:53:06 localhost puppet-user[52601]: Changes: Feb 20 02:53:06 localhost puppet-user[52601]: Total: 7 Feb 20 02:53:06 localhost puppet-user[52601]: Events: Feb 20 02:53:06 localhost puppet-user[52601]: Success: 7 Feb 20 02:53:06 localhost 
puppet-user[52601]: Total: 7 Feb 20 02:53:06 localhost puppet-user[52601]: Resources: Feb 20 02:53:06 localhost puppet-user[52601]: Skipped: 13 Feb 20 02:53:06 localhost puppet-user[52601]: Changed: 5 Feb 20 02:53:06 localhost puppet-user[52601]: Out of sync: 5 Feb 20 02:53:06 localhost puppet-user[52601]: Total: 20 Feb 20 02:53:06 localhost puppet-user[52601]: Time: Feb 20 02:53:06 localhost puppet-user[52601]: File: 0.01 Feb 20 02:53:06 localhost puppet-user[52601]: Transaction evaluation: 0.03 Feb 20 02:53:06 localhost puppet-user[52601]: Catalog application: 0.03 Feb 20 02:53:06 localhost puppet-user[52601]: Config retrieval: 0.30 Feb 20 02:53:06 localhost puppet-user[52601]: Last run: 1771573986 Feb 20 02:53:06 localhost puppet-user[52601]: Total: 0.03 Feb 20 02:53:06 localhost puppet-user[52601]: Version: Feb 20 02:53:06 localhost puppet-user[52601]: Config: 1771573985 Feb 20 02:53:06 localhost puppet-user[52601]: Puppet: 7.10.0 Feb 20 02:53:06 localhost puppet-user[52553]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41) Feb 20 02:53:06 localhost puppet-user[52569]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.38 seconds Feb 20 02:53:06 localhost systemd[1]: libpod-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05.scope: Deactivated successfully. Feb 20 02:53:06 localhost systemd[1]: libpod-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05.scope: Consumed 4.279s CPU time. 
Feb 20 02:53:06 localhost podman[52451]: 2026-02-20 07:53:06.063387683 +0000 UTC m=+6.041716391 container died 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=container-puppet-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64) Feb 20 02:53:06 localhost systemd[1]: libpod-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab.scope: Deactivated successfully. Feb 20 02:53:06 localhost systemd[1]: libpod-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab.scope: Consumed 2.152s CPU time. 
Feb 20 02:53:06 localhost podman[52443]: 2026-02-20 07:53:06.092158493 +0000 UTC m=+6.075727951 container died 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_id=tripleo_puppet_step1, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 20 02:53:06 localhost puppet-user[52553]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5) Feb 20 02:53:06 localhost puppet-user[52553]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5) Feb 20 02:53:06 localhost puppet-user[52553]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5) Feb 20 02:53:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05-userdata-shm.mount: Deactivated successfully. Feb 20 02:53:06 localhost systemd[1]: var-lib-containers-storage-overlay-bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a-merged.mount: Deactivated successfully. 
Feb 20 02:53:06 localhost puppet-user[52553]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated Feb 20 02:53:06 localhost puppet-user[52553]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set. Feb 20 02:53:06 localhost puppet-user[52553]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640' Feb 20 02:53:06 localhost podman[53019]: 2026-02-20 07:53:06.185087555 +0000 UTC m=+0.082505135 container cleanup 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, distribution-scope=public, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': 
['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, container_name=container-puppet-crond, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 02:53:06 localhost systemd[1]: libpod-conmon-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab.scope: Deactivated successfully. Feb 20 02:53:06 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: 
/Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: 
removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed Feb 20 02:53:06 localhost podman[53002]: 2026-02-20 07:53:06.225406157 +0000 UTC m=+0.154316135 container cleanup 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13) Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750' Feb 20 02:53:06 localhost systemd[1]: libpod-conmon-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05.scope: Deactivated successfully. 
Feb 20 02:53:06 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} 
--log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as 
'{sha256}8dd3769945b86c38433504b97f7851a931eb3c94b667298d10a9796a3d020595' Feb 20 02:53:06 localhost systemd[1]: libpod-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2.scope: Deactivated successfully. Feb 20 02:53:06 localhost systemd[1]: libpod-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2.scope: Consumed 2.194s CPU time. Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c' Feb 20 02:53:06 localhost podman[52470]: 2026-02-20 07:53:06.324794173 +0000 UTC m=+6.290626374 container died 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=container-puppet-metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': 
False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public) Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: 
/Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as 
'{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62' Feb 20 02:53:06 localhost puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed Feb 20 02:53:06 localhost puppet-user[52569]: Notice: Applied catalog in 0.28 seconds Feb 20 02:53:06 localhost puppet-user[52569]: Application: Feb 20 02:53:06 localhost puppet-user[52569]: Initial environment: production Feb 20 02:53:06 localhost puppet-user[52569]: Converged environment: production Feb 20 02:53:06 localhost puppet-user[52569]: Run mode: user Feb 20 02:53:06 localhost puppet-user[52569]: Changes: Feb 20 02:53:06 localhost puppet-user[52569]: Total: 43 Feb 20 02:53:06 localhost puppet-user[52569]: Events: Feb 20 02:53:06 localhost puppet-user[52569]: Success: 43 Feb 20 02:53:06 localhost puppet-user[52569]: Total: 43 Feb 20 02:53:06 localhost puppet-user[52569]: Resources: Feb 20 02:53:06 localhost puppet-user[52569]: Skipped: 14 Feb 20 02:53:06 localhost puppet-user[52569]: Changed: 38 Feb 20 02:53:06 localhost puppet-user[52569]: Out of sync: 38 Feb 20 02:53:06 localhost puppet-user[52569]: Total: 82 Feb 20 02:53:06 localhost puppet-user[52569]: Time: Feb 20 02:53:06 localhost puppet-user[52569]: Concat file: 0.00 Feb 20 02:53:06 localhost puppet-user[52569]: File: 0.17 Feb 20 02:53:06 localhost puppet-user[52569]: Transaction evaluation: 0.27 Feb 20 02:53:06 localhost puppet-user[52569]: Catalog application: 0.28 Feb 20 02:53:06 localhost puppet-user[52569]: Config retrieval: 0.49 Feb 20 02:53:06 localhost puppet-user[52569]: Last run: 1771573986 Feb 20 02:53:06 localhost puppet-user[52569]: Concat fragment: 0.00 Feb 20 02:53:06 localhost puppet-user[52569]: Total: 0.28 Feb 20 02:53:06 localhost puppet-user[52569]: Version: Feb 20 02:53:06 localhost puppet-user[52569]: Config: 1771573985 Feb 20 02:53:06 localhost puppet-user[52569]: Puppet: 7.10.0 Feb 20 02:53:06 localhost podman[53105]: 2026-02-20 07:53:06.388081169 +0000 
UTC m=+0.056157623 container cleanup 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 02:53:06 localhost systemd[1]: libpod-conmon-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2.scope: Deactivated successfully. Feb 20 02:53:06 localhost puppet-user[52553]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used. 
Feb 20 02:53:06 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file 
--log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 20 02:53:06 localhost systemd[1]: libpod-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508.scope: Deactivated successfully. Feb 20 02:53:06 localhost systemd[1]: libpod-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508.scope: Consumed 2.631s CPU time. 
Feb 20 02:53:06 localhost podman[52450]: 2026-02-20 07:53:06.881216824 +0000 UTC m=+6.861635312 container died aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=container-puppet-collectd, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 02:53:06 localhost podman[53200]: 2026-02-20 07:53:06.95257074 +0000 UTC m=+0.071470040 container create 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, container_name=container-puppet-ovn_controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO 
Team) Feb 20 02:53:06 localhost systemd[1]: Started libpod-conmon-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985.scope. Feb 20 02:53:06 localhost puppet-user[52553]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 1.35 seconds Feb 20 02:53:06 localhost systemd[1]: Started libcrun container. Feb 20 02:53:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53ada3da4ca04351bf169e5d627c0fcff441ff8e221128687b0e29666c5bc26c/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53ada3da4ca04351bf169e5d627c0fcff441ff8e221128687b0e29666c5bc26c/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:06 localhost podman[53246]: 2026-02-20 07:53:06.996594377 +0000 UTC m=+0.109110495 container create 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, architecture=x86_64, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, version=17.1.13, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=container-puppet-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog) Feb 20 02:53:07 localhost podman[53200]: 2026-02-20 07:53:07.003094412 +0000 UTC m=+0.121993712 container init 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, release=1766032510, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 02:53:07 localhost podman[53200]: 2026-02-20 07:53:06.914665389 +0000 UTC m=+0.033564729 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 20 02:53:07 localhost podman[53246]: 2026-02-20 07:53:06.914806673 +0000 UTC m=+0.027322761 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 20 02:53:07 localhost podman[53257]: 2026-02-20 07:53:07.033281634 +0000 UTC m=+0.144196936 container cleanup aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, release=1766032510, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, container_name=container-puppet-collectd, distribution-scope=public) Feb 20 02:53:07 localhost systemd[1]: libpod-conmon-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508.scope: Deactivated successfully. Feb 20 02:53:07 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 20 02:53:07 localhost podman[53200]: 2026-02-20 07:53:07.060420079 +0000 UTC m=+0.179319369 container start 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, architecture=x86_64, release=1766032510, 
distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Feb 20 02:53:07 localhost podman[53200]: 2026-02-20 07:53:07.061709685 +0000 UTC m=+0.180609005 container attach 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-ovn_controller, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 02:53:07 localhost systemd[1]: Started libpod-conmon-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e.scope. Feb 20 02:53:07 localhost systemd[1]: Started libcrun container. 
Feb 20 02:53:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bb1f8a81ebf31c6df88a84cd13b1c78ab0b7c78b4f247f0212f5208091a25c0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:07 localhost podman[53246]: 2026-02-20 07:53:07.108697476 +0000 UTC m=+0.221213634 container init 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 20 02:53:07 localhost podman[53246]: 2026-02-20 07:53:07.121513943 +0000 UTC m=+0.234030031 container start 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, container_name=container-puppet-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, config_id=tripleo_puppet_step1, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog) Feb 20 02:53:07 localhost systemd[1]: tmp-crun.YX1cgo.mount: Deactivated successfully. Feb 20 02:53:07 localhost systemd[1]: var-lib-containers-storage-overlay-17d67d7c6c3046ba2041c4048263641e426665d92e1e8fa18e3c871ca9222f66-merged.mount: Deactivated successfully. Feb 20 02:53:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2-userdata-shm.mount: Deactivated successfully. Feb 20 02:53:07 localhost systemd[1]: var-lib-containers-storage-overlay-0b4a27664720ce930aee8034c0e3a2e981bce86564061fc7e3c5cc60116ab629-merged.mount: Deactivated successfully. Feb 20 02:53:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab-userdata-shm.mount: Deactivated successfully. Feb 20 02:53:07 localhost systemd[1]: var-lib-containers-storage-overlay-2f138d9d6c461962e8cf2ee8539c9294af2f13aab0c8b266d53219a78c733e21-merged.mount: Deactivated successfully. Feb 20 02:53:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508-userdata-shm.mount: Deactivated successfully. 
Feb 20 02:53:07 localhost podman[53246]: 2026-02-20 07:53:07.123780927 +0000 UTC m=+0.236297095 container attach 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:09Z, distribution-scope=public, name=rhosp-rhel9/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, container_name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1) Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}37542e92f883a9129d79835364a7293bd4c337025ae650a647285cb3357f99b9' Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe' Feb 20 02:53:08 localhost puppet-user[52553]: Warning: Empty environment setting 'TLS_PASSWORD' Feb 20 02:53:08 localhost puppet-user[52553]: (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182) Feb 20 02:53:08 localhost puppet-user[52553]: Notice: 
/Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}5bbbbc79dd1f184aec3b40a4e5d830cb87a3dca9076a18726a5379ee062cd087' Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: 
/Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created Feb 20 02:53:08 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: 
/Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created 
Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created Feb 20 02:53:09 localhost podman[52357]: 2026-02-20 07:52:59.975591018 +0000 UTC m=+0.043864793 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created Feb 20 02:53:09 localhost puppet-user[52553]: Notice: 
/Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created Feb 20 02:53:09 localhost podman[53465]: 2026-02-20 07:53:09.864595361 +0000 UTC m=+0.071134338 container create b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:24Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, url=https://www.redhat.com, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, name=rhosp-rhel9/openstack-ceilometer-central, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:07:24Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created Feb 20 02:53:09 localhost systemd[1]: Started libpod-conmon-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d.scope. Feb 20 02:53:09 localhost systemd[1]: Started libcrun container. 
Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created Feb 20 02:53:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d59145aa9c81750f9d2e26499ec90595af58708a19d0844b9fae7fcd52a3b340/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created Feb 20 02:53:09 localhost podman[53465]: 2026-02-20 07:53:09.917528429 +0000 UTC m=+0.124067426 container init b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, release=1766032510, build-date=2026-01-12T23:07:24Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central) Feb 20 02:53:09 localhost podman[53465]: 2026-02-20 07:53:09.823806318 +0000 UTC m=+0.030345335 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 20 02:53:09 localhost podman[53465]: 2026-02-20 07:53:09.924078825 +0000 UTC m=+0.130617812 container start b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-central, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:24Z, tcib_managed=true, config_id=tripleo_puppet_step1, vcs-type=git, url=https://www.redhat.com, container_name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T23:07:24Z, name=rhosp-rhel9/openstack-ceilometer-central, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13) Feb 20 02:53:09 localhost podman[53465]: 2026-02-20 07:53:09.924464416 +0000 UTC m=+0.131003463 container attach b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, release=1766032510, config_id=tripleo_puppet_step1, build-date=2026-01-12T23:07:24Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible) Feb 20 02:53:09 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Feb 20 02:53:10 localhost puppet-user[53366]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 02:53:10 localhost puppet-user[53366]: (file: /etc/puppet/hiera.yaml) Feb 20 02:53:10 localhost puppet-user[53366]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:53:10 localhost puppet-user[53366]: (file & line not available) Feb 20 02:53:10 localhost puppet-user[53366]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:53:10 localhost puppet-user[53366]: (file & line not available) Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created Feb 20 02:53:10 localhost puppet-user[53324]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Feb 20 02:53:10 localhost puppet-user[53324]: (file: /etc/puppet/hiera.yaml) Feb 20 02:53:10 localhost puppet-user[53324]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:53:10 localhost puppet-user[53324]: (file & line not available) Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created Feb 20 02:53:10 localhost puppet-user[53366]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.24 seconds Feb 20 02:53:10 localhost puppet-user[53324]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:53:10 localhost puppet-user[53324]: (file & line not available) Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: 
/Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created Feb 20 02:53:10 localhost puppet-user[53366]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2' Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}64c5f9c37bfdcd550f09aea32895662c8b3e80da678034168cc6138d9da68080' Feb 20 02:53:10 localhost puppet-user[53366]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b' Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created Feb 20 02:53:10 localhost puppet-user[53366]: Notice: 
/Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}9a4304e8b8ecdc53ca05b5d959decd5d06e8240e50d42ed6e1a54c334031e88c' Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created Feb 20 02:53:10 localhost puppet-user[53366]: Notice: Applied catalog in 0.12 seconds Feb 20 02:53:10 localhost puppet-user[53366]: Application: Feb 20 02:53:10 localhost puppet-user[53366]: Initial environment: production Feb 20 02:53:10 localhost puppet-user[53366]: Converged environment: production Feb 20 02:53:10 localhost puppet-user[53366]: Run mode: user Feb 20 02:53:10 localhost puppet-user[53366]: Changes: Feb 20 02:53:10 localhost puppet-user[53366]: Total: 3 Feb 20 02:53:10 localhost puppet-user[53366]: Events: Feb 20 02:53:10 localhost puppet-user[53366]: Success: 3 Feb 20 02:53:10 localhost puppet-user[53366]: Total: 3 Feb 20 02:53:10 localhost puppet-user[53366]: Resources: Feb 20 02:53:10 localhost puppet-user[53366]: Skipped: 11 Feb 20 02:53:10 localhost puppet-user[53366]: Changed: 3 Feb 20 02:53:10 localhost puppet-user[53366]: Out of sync: 3 Feb 20 02:53:10 localhost puppet-user[53366]: Total: 25 Feb 20 02:53:10 localhost puppet-user[53366]: Time: Feb 20 02:53:10 localhost puppet-user[53366]: Concat file: 0.00 Feb 20 02:53:10 localhost puppet-user[53366]: Concat fragment: 0.00 Feb 20 02:53:10 localhost puppet-user[53366]: File: 0.02 Feb 20 02:53:10 localhost puppet-user[53366]: Transaction evaluation: 0.12 Feb 20 02:53:10 localhost puppet-user[53366]: Catalog application: 0.12 Feb 20 02:53:10 localhost puppet-user[53366]: Config retrieval: 0.29 Feb 20 02:53:10 localhost puppet-user[53366]: Last run: 1771573990 Feb 20 02:53:10 localhost puppet-user[53366]: Total: 
0.12 Feb 20 02:53:10 localhost puppet-user[53366]: Version: Feb 20 02:53:10 localhost puppet-user[53366]: Config: 1771573990 Feb 20 02:53:10 localhost puppet-user[53366]: Puppet: 7.10.0 Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created Feb 20 02:53:10 localhost puppet-user[53324]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.26 seconds Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53629]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642 Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53631]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-type=geneve Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53633]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.108 Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53650]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005625204.localdomain Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005625204.novalocal' to 'np0005625204.localdomain' Feb 20 02:53:10 localhost ovs-vsctl[53657]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int Feb 20 02:53:10 localhost systemd[1]: libpod-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e.scope: Deactivated successfully. Feb 20 02:53:10 localhost systemd[1]: libpod-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e.scope: Consumed 3.474s CPU time. 
Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created Feb 20 02:53:10 localhost podman[53246]: 2026-02-20 07:53:10.725741391 +0000 UTC m=+3.838257509 container died 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, container_name=container-puppet-rsyslog, release=1766032510, config_id=tripleo_puppet_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z) Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53674]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-remote-probe-interval=60000 Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53676]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60 Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53678]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53680]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000 Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53682]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-tos=0 Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53684]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:4f:a1:a1 Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53686]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53688]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created Feb 20 02:53:10 localhost ovs-vsctl[53690]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:garp-max-timeout-sec=0 Feb 20 02:53:10 localhost puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created Feb 20 02:53:10 localhost puppet-user[53324]: Notice: Applied catalog in 0.40 seconds Feb 20 02:53:10 localhost puppet-user[53324]: Application: Feb 20 02:53:10 localhost puppet-user[53324]: Initial environment: production Feb 20 02:53:10 localhost puppet-user[53324]: Converged environment: production Feb 20 02:53:10 localhost puppet-user[53324]: Run mode: user Feb 20 02:53:10 localhost puppet-user[53324]: Changes: Feb 20 02:53:10 localhost puppet-user[53324]: Total: 14 Feb 20 02:53:10 localhost puppet-user[53324]: Events: Feb 20 02:53:10 localhost puppet-user[53324]: Success: 14 Feb 20 02:53:10 localhost puppet-user[53324]: Total: 14 Feb 20 02:53:10 localhost puppet-user[53324]: Resources: Feb 20 02:53:10 localhost puppet-user[53324]: Skipped: 12 Feb 20 02:53:10 localhost puppet-user[53324]: Changed: 14 Feb 20 02:53:10 localhost puppet-user[53324]: Out of sync: 14 Feb 20 02:53:10 localhost puppet-user[53324]: Total: 29 Feb 20 
02:53:10 localhost puppet-user[53324]: Time: Feb 20 02:53:10 localhost puppet-user[53324]: Exec: 0.01 Feb 20 02:53:10 localhost puppet-user[53324]: Config retrieval: 0.29 Feb 20 02:53:10 localhost puppet-user[53324]: Vs config: 0.35 Feb 20 02:53:10 localhost puppet-user[53324]: Transaction evaluation: 0.39 Feb 20 02:53:10 localhost puppet-user[53324]: Catalog application: 0.40 Feb 20 02:53:10 localhost puppet-user[53324]: Last run: 1771573990 Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created Feb 20 02:53:10 localhost puppet-user[53324]: Total: 0.40 Feb 20 02:53:10 localhost puppet-user[53324]: Version: Feb 20 02:53:10 localhost puppet-user[53324]: Config: 1771573990 Feb 20 02:53:10 localhost puppet-user[53324]: Puppet: 7.10.0 Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created Feb 20 02:53:10 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created Feb 20 02:53:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e-userdata-shm.mount: Deactivated successfully. Feb 20 02:53:11 localhost systemd[1]: var-lib-containers-storage-overlay-4bb1f8a81ebf31c6df88a84cd13b1c78ab0b7c78b4f247f0212f5208091a25c0-merged.mount: Deactivated successfully. 
Feb 20 02:53:11 localhost systemd[1]: libpod-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985.scope: Deactivated successfully. Feb 20 02:53:11 localhost systemd[1]: libpod-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985.scope: Consumed 2.832s CPU time. Feb 20 02:53:11 localhost podman[53200]: 2026-02-20 07:53:11.40330573 +0000 UTC m=+4.522205060 container died 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 
'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 02:53:11 localhost podman[53664]: 2026-02-20 07:53:11.415841868 +0000 UTC m=+0.685105095 container cleanup 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 
'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-rsyslog, version=17.1.13) Feb 20 02:53:11 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 20 02:53:11 localhost systemd[1]: libpod-conmon-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e.scope: Deactivated successfully. 
Feb 20 02:53:11 localhost podman[53287]: 2026-02-20 07:53:06.971367487 +0000 UTC m=+0.037037918 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Feb 20 02:53:11 localhost systemd[1]: tmp-crun.2y8Apn.mount: Deactivated successfully.
Feb 20 02:53:11 localhost podman[53733]: 2026-02-20 07:53:11.591133441 +0000 UTC m=+0.178266989 container cleanup 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 02:53:11 localhost systemd[1]: libpod-conmon-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985.scope: Deactivated successfully.
Feb 20 02:53:11 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 02:53:11 localhost podman[53788]: 2026-02-20 07:53:11.614759746 +0000 UTC m=+0.071485072 container create 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, release=1766032510, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-server, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-server-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_puppet_step1, tcib_managed=true, container_name=container-puppet-neutron, org.opencontainers.image.created=2026-01-12T22:57:35Z)
Feb 20 02:53:11 localhost systemd[1]: Started libpod-conmon-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b.scope.
Feb 20 02:53:11 localhost systemd[1]: Started libcrun container.
Feb 20 02:53:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c773c83c6503477114a5b4bf49e71270791ffb8bdafb74f6f588401adb71807d/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 02:53:11 localhost podman[53788]: 2026-02-20 07:53:11.674559822 +0000 UTC m=+0.131285148 container init 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=container-puppet-neutron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:57:35Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T22:57:35Z, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 02:53:11 localhost podman[53788]: 2026-02-20 07:53:11.575502605 +0000 UTC m=+0.032227941 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 20 02:53:11 localhost podman[53788]: 2026-02-20 07:53:11.682357145 +0000 UTC m=+0.139082441 container start 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, name=rhosp-rhel9/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-server, release=1766032510, org.opencontainers.image.created=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, build-date=2026-01-12T22:57:35Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, container_name=container-puppet-neutron, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, com.redhat.component=openstack-neutron-server-container, io.openshift.expose-services=)
Feb 20 02:53:11 localhost podman[53788]: 2026-02-20 07:53:11.68287947 +0000 UTC m=+0.139604846 container attach 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-server, org.opencontainers.image.created=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-neutron-server-container, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, config_id=tripleo_puppet_step1, container_name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server)
Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Feb 20 02:53:11 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Feb 20 02:53:12 localhost systemd[1]: var-lib-containers-storage-overlay-53ada3da4ca04351bf169e5d627c0fcff441ff8e221128687b0e29666c5bc26c-merged.mount: Deactivated successfully.
Feb 20 02:53:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985-userdata-shm.mount: Deactivated successfully.
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 02:53:12 localhost puppet-user[53550]: (file: /etc/puppet/hiera.yaml)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Undefined variable '::deploy_config_name';
Feb 20 02:53:12 localhost puppet-user[53550]: (file & line not available)
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 02:53:12 localhost puppet-user[53550]: (file & line not available)
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Feb 20 02:53:12 localhost puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.38 seconds
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Feb 20 02:53:12 localhost puppet-user[52553]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}66a7ab6cc1a19ea5002a5aaa2cfb2f196778c89c859d0afac926fe3fac9c75a4'
Feb 20 02:53:13 localhost puppet-user[53550]: Notice: Applied catalog in 0.44 seconds
Feb 20 02:53:13 localhost puppet-user[53550]: Application:
Feb 20 02:53:13 localhost puppet-user[53550]: Initial environment: production
Feb 20 02:53:13 localhost puppet-user[53550]: Converged environment: production
Feb 20 02:53:13 localhost puppet-user[53550]: Run mode: user
Feb 20 02:53:13 localhost puppet-user[53550]: Changes:
Feb 20 02:53:13 localhost puppet-user[53550]: Total: 31
Feb 20 02:53:13 localhost puppet-user[53550]: Events:
Feb 20 02:53:13 localhost puppet-user[53550]: Success: 31
Feb 20 02:53:13 localhost puppet-user[53550]: Total: 31
Feb 20 02:53:13 localhost puppet-user[53550]: Resources:
Feb 20 02:53:13 localhost puppet-user[53550]: Skipped: 22
Feb 20 02:53:13 localhost puppet-user[53550]: Changed: 31
Feb 20 02:53:13 localhost puppet-user[53550]: Out of sync: 31
Feb 20 02:53:13 localhost puppet-user[53550]: Total: 151
Feb 20 02:53:13 localhost puppet-user[53550]: Time:
Feb 20 02:53:13 localhost puppet-user[53550]: Package: 0.01
Feb 20 02:53:13 localhost puppet-user[53550]: Ceilometer config: 0.35
Feb 20 02:53:13 localhost puppet-user[53550]: Transaction evaluation: 0.43
Feb 20 02:53:13 localhost puppet-user[53550]: Catalog application: 0.44
Feb 20 02:53:13 localhost puppet-user[53550]: Config retrieval: 0.46
Feb 20 02:53:13 localhost puppet-user[53550]: Last run: 1771573993
Feb 20 02:53:13 localhost puppet-user[53550]: Resources: 0.00
Feb 20 02:53:13 localhost puppet-user[53550]: Total: 0.44
Feb 20 02:53:13 localhost puppet-user[53550]: Version:
Feb 20 02:53:13 localhost puppet-user[52553]: Notice: Applied catalog in 5.77 seconds
Feb 20 02:53:13 localhost puppet-user[53550]: Config: 1771573992
Feb 20 02:53:13 localhost puppet-user[53550]: Puppet: 7.10.0
Feb 20 02:53:13 localhost puppet-user[52553]: Application:
Feb 20 02:53:13 localhost puppet-user[52553]: Initial environment: production
Feb 20 02:53:13 localhost puppet-user[52553]: Converged environment: production
Feb 20 02:53:13 localhost puppet-user[52553]: Run mode: user
Feb 20 02:53:13 localhost puppet-user[52553]: Changes:
Feb 20 02:53:13 localhost puppet-user[52553]: Total: 183
Feb 20 02:53:13 localhost puppet-user[52553]: Events:
Feb 20 02:53:13 localhost puppet-user[52553]: Success: 183
Feb 20 02:53:13 localhost puppet-user[52553]: Total: 183
Feb 20 02:53:13 localhost puppet-user[52553]: Resources:
Feb 20 02:53:13 localhost puppet-user[52553]: Changed: 183
Feb 20 02:53:13 localhost puppet-user[52553]: Out of sync: 183
Feb 20 02:53:13 localhost puppet-user[52553]: Skipped: 57
Feb 20 02:53:13 localhost puppet-user[52553]: Total: 487
Feb 20 02:53:13 localhost puppet-user[52553]: Time:
Feb 20 02:53:13 localhost puppet-user[52553]: Concat fragment: 0.00
Feb 20 02:53:13 localhost puppet-user[52553]: Anchor: 0.00
Feb 20 02:53:13 localhost puppet-user[52553]: File line: 0.00
Feb 20 02:53:13 localhost puppet-user[52553]: Virtlogd config: 0.01
Feb 20 02:53:13 localhost puppet-user[52553]: Virtstoraged config: 0.01
Feb 20 02:53:13 localhost puppet-user[52553]: Exec: 0.01
Feb 20 02:53:13 localhost puppet-user[52553]: Virtnodedevd config: 0.02
Feb 20 02:53:13 localhost puppet-user[52553]: Virtqemud config: 0.02
Feb 20 02:53:13 localhost puppet-user[52553]: File: 0.02
Feb 20 02:53:13 localhost puppet-user[52553]: Virtsecretd config: 0.02
Feb 20 02:53:13 localhost puppet-user[52553]: Package: 0.02
Feb 20 02:53:13 localhost puppet-user[52553]: Virtproxyd config: 0.03
Feb 20 02:53:13 localhost puppet-user[52553]: Augeas: 0.92
Feb 20 02:53:13 localhost puppet-user[52553]: Config retrieval: 1.60
Feb 20 02:53:13 localhost puppet-user[52553]: Last run: 1771573993
Feb 20 02:53:13 localhost puppet-user[52553]: Nova config: 3.12
Feb 20 02:53:13 localhost puppet-user[52553]: Resources: 0.00
Feb 20 02:53:13 localhost puppet-user[52553]: Transaction evaluation: 5.76
Feb 20 02:53:13 localhost puppet-user[52553]: Catalog application: 5.77
Feb 20 02:53:13 localhost puppet-user[52553]: Concat file: 0.00
Feb 20 02:53:13 localhost puppet-user[52553]: Total: 5.77
Feb 20 02:53:13 localhost puppet-user[52553]: Version:
Feb 20 02:53:13 localhost puppet-user[52553]: Config: 1771573985
Feb 20 02:53:13 localhost puppet-user[52553]: Puppet: 7.10.0
Feb 20 02:53:13 localhost systemd[1]: libpod-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d.scope: Deactivated successfully.
Feb 20 02:53:13 localhost systemd[1]: libpod-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d.scope: Consumed 3.174s CPU time.
Feb 20 02:53:13 localhost podman[53465]: 2026-02-20 07:53:13.406970957 +0000 UTC m=+3.613509964 container died b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:24Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, 
distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_puppet_step1, build-date=2026-01-12T23:07:24Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central) Feb 20 02:53:13 localhost puppet-user[53843]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass Feb 20 02:53:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d-userdata-shm.mount: Deactivated successfully. Feb 20 02:53:13 localhost systemd[1]: var-lib-containers-storage-overlay-d59145aa9c81750f9d2e26499ec90595af58708a19d0844b9fae7fcd52a3b340-merged.mount: Deactivated successfully. 
Feb 20 02:53:13 localhost podman[53954]: 2026-02-20 07:53:13.498526261 +0000 UTC m=+0.082971400 container cleanup b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-type=git, config_id=tripleo_puppet_step1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, build-date=2026-01-12T23:07:24Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 
17.1 ceilometer-central, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:24Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, architecture=x86_64, com.redhat.component=openstack-ceilometer-central-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-central, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=container-puppet-ceilometer) Feb 20 02:53:13 localhost systemd[1]: libpod-conmon-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d.scope: Deactivated successfully. 
Feb 20 02:53:13 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 20 02:53:13 localhost puppet-user[53843]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 02:53:13 localhost puppet-user[53843]: (file: /etc/puppet/hiera.yaml) Feb 20 02:53:13 localhost puppet-user[53843]: Warning: Undefined variable '::deploy_config_name'; Feb 20 02:53:13 localhost puppet-user[53843]: (file & line not available) Feb 20 02:53:13 localhost puppet-user[53843]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 02:53:13 localhost puppet-user[53843]: (file & line not available) Feb 20 02:53:13 localhost puppet-user[53843]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37) Feb 20 02:53:13 localhost systemd[1]: libpod-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed.scope: Deactivated successfully. Feb 20 02:53:13 localhost systemd[1]: libpod-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed.scope: Consumed 9.844s CPU time. Feb 20 02:53:13 localhost podman[52439]: 2026-02-20 07:53:13.817164595 +0000 UTC m=+13.808403101 container died 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 
'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_puppet_step1, io.openshift.expose-services=, container_name=container-puppet-nova_libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 02:53:13 localhost podman[54092]: 2026-02-20 07:53:13.924884239 +0000 UTC m=+0.102663970 container cleanup 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude 
tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt) Feb 20 02:53:13 localhost systemd[1]: libpod-conmon-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed.scope: Deactivated successfully. 
Feb 20 02:53:13 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt 
profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw 
--volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 02:53:14 localhost sshd[54133]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:53:14 localhost puppet-user[53843]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.63 seconds Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created Feb 20 02:53:14 
localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created Feb 20 02:53:14 localhost 
puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created Feb 20 02:53:14 localhost systemd[1]: tmp-crun.jiSs8e.mount: Deactivated successfully. Feb 20 02:53:14 localhost systemd[1]: var-lib-containers-storage-overlay-4e6d071d08fea63259fe30a26bb9b27228bc0b7a6111c0f215f4e35846a4b7e3-merged.mount: Deactivated successfully. Feb 20 02:53:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed-userdata-shm.mount: Deactivated successfully. Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: 
/Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created Feb 20 02:53:14 localhost puppet-user[53843]: Notice: Applied catalog in 0.43 seconds Feb 20 02:53:14 localhost puppet-user[53843]: Application: Feb 20 02:53:14 localhost puppet-user[53843]: Initial environment: production Feb 20 02:53:14 localhost puppet-user[53843]: Converged environment: production Feb 20 02:53:14 localhost puppet-user[53843]: Run mode: user Feb 20 02:53:14 localhost puppet-user[53843]: Changes: Feb 20 02:53:14 localhost puppet-user[53843]: Total: 33 Feb 20 02:53:14 localhost puppet-user[53843]: Events: Feb 20 02:53:14 localhost puppet-user[53843]: Success: 33 Feb 20 02:53:14 localhost puppet-user[53843]: Total: 33 Feb 20 02:53:14 localhost puppet-user[53843]: Resources: Feb 20 02:53:14 localhost puppet-user[53843]: Skipped: 21 Feb 20 02:53:14 localhost puppet-user[53843]: Changed: 33 Feb 20 02:53:14 localhost puppet-user[53843]: Out of sync: 33 Feb 20 02:53:14 localhost puppet-user[53843]: Total: 155 Feb 20 02:53:14 localhost puppet-user[53843]: Time: Feb 20 02:53:14 localhost puppet-user[53843]: Resources: 0.00 Feb 20 02:53:14 localhost puppet-user[53843]: Ovn metadata agent config: 0.02 Feb 20 02:53:14 localhost puppet-user[53843]: Neutron config: 0.35 Feb 20 02:53:14 localhost puppet-user[53843]: Transaction evaluation: 0.43 Feb 20 02:53:14 localhost puppet-user[53843]: Catalog application: 0.43 Feb 20 02:53:14 localhost puppet-user[53843]: Config retrieval: 0.70 Feb 20 02:53:14 localhost puppet-user[53843]: Last run: 1771573994 Feb 20 02:53:14 localhost puppet-user[53843]: Total: 0.43 Feb 20 02:53:14 localhost puppet-user[53843]: Version: Feb 20 02:53:14 localhost puppet-user[53843]: Config: 1771573993 Feb 20 02:53:14 localhost puppet-user[53843]: Puppet: 7.10.0 Feb 20 02:53:15 
localhost systemd[1]: libpod-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b.scope: Deactivated successfully. Feb 20 02:53:15 localhost systemd[1]: libpod-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b.scope: Consumed 3.589s CPU time. Feb 20 02:53:15 localhost podman[53788]: 2026-02-20 07:53:15.314631284 +0000 UTC m=+3.771356650 container died 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T22:57:35Z, summary=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:57:35Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-neutron-server) Feb 20 02:53:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b-userdata-shm.mount: Deactivated successfully. Feb 20 02:53:15 localhost systemd[1]: var-lib-containers-storage-overlay-c773c83c6503477114a5b4bf49e71270791ffb8bdafb74f6f588401adb71807d-merged.mount: Deactivated successfully. 
Feb 20 02:53:15 localhost podman[54169]: 2026-02-20 07:53:15.45146744 +0000 UTC m=+0.125726479 container cleanup 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-server-container, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp-rhel9/openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-neutron, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, build-date=2026-01-12T22:57:35Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:57:35Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1) Feb 20 02:53:15 localhost systemd[1]: libpod-conmon-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b.scope: Deactivated successfully. 
Feb 20 02:53:15 localhost python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Feb 20 02:53:16 localhost python3[54224]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:17 localhost python3[54256]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:53:18 localhost python3[54306]: ansible-ansible.legacy.stat 
Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:53:19 localhost python3[54349]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573998.3857064-85051-93623064883299/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:19 localhost python3[54411]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:53:19 localhost python3[54454]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573999.2260113-85051-25605579334971/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:20 localhost python3[54516]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:53:20 localhost python3[54559]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574000.1863053-85071-7139879288483/source 
dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:21 localhost python3[54621]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:53:21 localhost python3[54664]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574001.1139882-85083-15401580707635/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:22 localhost python3[54694]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:53:22 localhost systemd[1]: Reloading. Feb 20 02:53:22 localhost systemd-sysv-generator[54721]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:53:22 localhost systemd-rc-local-generator[54718]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 02:53:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:53:22 localhost systemd[1]: Reloading. Feb 20 02:53:22 localhost systemd-sysv-generator[54761]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:53:22 localhost systemd-rc-local-generator[54755]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:53:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:53:22 localhost systemd[1]: Starting TripleO Container Shutdown... Feb 20 02:53:22 localhost systemd[1]: Finished TripleO Container Shutdown. Feb 20 02:53:23 localhost python3[54818]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:53:23 localhost sshd[54861]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:53:23 localhost python3[54862]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574002.9107478-85163-182647580955573/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:24 localhost python3[54924]: ansible-ansible.legacy.stat Invoked with 
path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:53:24 localhost python3[54967]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574003.841421-85174-231287861662889/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:25 localhost python3[54997]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:53:25 localhost systemd[1]: Reloading. Feb 20 02:53:25 localhost systemd-rc-local-generator[55023]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:53:25 localhost systemd-sysv-generator[55028]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:53:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:53:25 localhost systemd[1]: Reloading. Feb 20 02:53:25 localhost systemd-rc-local-generator[55064]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:53:25 localhost systemd-sysv-generator[55067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:53:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:53:25 localhost systemd[1]: Starting Create netns directory... Feb 20 02:53:25 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 20 02:53:25 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 20 02:53:25 localhost systemd[1]: Finished Create netns directory. Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: c3cf83e3d6b9a6a9323d670f77d9e810 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: d31718fcd17fdeee6489534105191c7a Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 24eefedeb2e4ab8bab62979b617bbba7 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change 
detected for nova_virtqemud, new hash: 6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: eb8c5e608f55bc52c95871f92a543185 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: ed809cd151e1fa8da7409fe229c809b7 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: ed809cd151e1fa8da7409fe229c809b7 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 684ebb6e94768a0a31a4d8592f0686b3 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:26 localhost python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 
6f2a8ada21c5a8beb0844e05e372be87 Feb 20 02:53:27 localhost python3[55147]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 20 02:53:27 localhost podman[55184]: 2026-02-20 07:53:27.968342119 +0000 UTC m=+0.087385277 container create 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, container_name=metrics_qdr_init_logs, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 
'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 02:53:28 localhost systemd[1]: Started libpod-conmon-25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c.scope. Feb 20 02:53:28 localhost systemd[1]: Started libcrun container. Feb 20 02:53:28 localhost podman[55184]: 2026-02-20 07:53:27.924738659 +0000 UTC m=+0.043781797 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 20 02:53:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66b4607051ec4b678b98370429ea66c5b0f53009a9a85441acbc9ac68d517903/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:28 localhost podman[55184]: 2026-02-20 07:53:28.036840105 +0000 UTC m=+0.155883233 container init 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_id=tripleo_step1, url=https://www.redhat.com, tcib_managed=true, container_name=metrics_qdr_init_logs, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 02:53:28 localhost podman[55184]: 2026-02-20 07:53:28.051126234 +0000 UTC m=+0.170169382 container start 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'command': ['/bin/bash', '-c', 'chown -R 
qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr_init_logs, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 02:53:28 localhost podman[55184]: 2026-02-20 07:53:28.051862167 +0000 UTC m=+0.170905375 container attach 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, container_name=metrics_qdr_init_logs, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, 
com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, distribution-scope=public) Feb 20 02:53:28 localhost systemd[1]: libpod-25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c.scope: Deactivated successfully. Feb 20 02:53:28 localhost podman[55184]: 2026-02-20 07:53:28.05618728 +0000 UTC m=+0.175230438 container died 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, container_name=metrics_qdr_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, 
build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step1, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 02:53:28 localhost podman[55204]: 2026-02-20 07:53:28.14697525 +0000 UTC m=+0.076043458 container cleanup 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 02:53:28 localhost systemd[1]: libpod-conmon-25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c.scope: Deactivated successfully. Feb 20 02:53:28 localhost python3[55147]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Feb 20 02:53:28 localhost podman[55280]: 2026-02-20 07:53:28.517600143 +0000 UTC m=+0.071719815 container create f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, 
io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 02:53:28 localhost systemd[1]: Started libpod-conmon-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.scope. Feb 20 02:53:28 localhost systemd[1]: Started libcrun container. Feb 20 02:53:28 localhost podman[55280]: 2026-02-20 07:53:28.478197282 +0000 UTC m=+0.032317004 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 20 02:53:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/748996d00ab757a5bda247e45e6a81f3904e24554510d07cc1e7533917ef279a/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/748996d00ab757a5bda247e45e6a81f3904e24554510d07cc1e7533917ef279a/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Feb 20 02:53:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 02:53:28 localhost podman[55280]: 2026-02-20 07:53:28.617995819 +0000 UTC m=+0.172115501 container init f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, release=1766032510, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Feb 20 02:53:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 02:53:28 localhost podman[55280]: 2026-02-20 07:53:28.653578243 +0000 UTC m=+0.207697875 container start f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13) Feb 20 02:53:28 localhost python3[55147]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c3cf83e3d6b9a6a9323d670f77d9e810 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 20 02:53:28 localhost 
podman[55303]: 2026-02-20 07:53:28.746885691 +0000 UTC m=+0.087172130 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z) Feb 20 02:53:28 localhost podman[55303]: 2026-02-20 07:53:28.966349227 +0000 UTC m=+0.306635706 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container) Feb 20 02:53:28 localhost systemd[1]: tmp-crun.3SZiRu.mount: Deactivated successfully. Feb 20 02:53:28 localhost systemd[1]: var-lib-containers-storage-overlay-66b4607051ec4b678b98370429ea66c5b0f53009a9a85441acbc9ac68d517903-merged.mount: Deactivated successfully. Feb 20 02:53:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c-userdata-shm.mount: Deactivated successfully. Feb 20 02:53:28 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 02:53:29 localhost python3[55374]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:29 localhost python3[55390]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:53:30 localhost python3[55451]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574009.5332015-85365-187906120463055/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:30 localhost python3[55467]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 02:53:30 localhost systemd[1]: Reloading. Feb 20 02:53:30 localhost systemd-sysv-generator[55495]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:53:30 localhost systemd-rc-local-generator[55492]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:53:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Feb 20 02:53:31 localhost python3[55519]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:53:31 localhost systemd[1]: Reloading. Feb 20 02:53:31 localhost systemd-rc-local-generator[55551]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:53:31 localhost systemd-sysv-generator[55554]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:53:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:53:31 localhost systemd[1]: Starting metrics_qdr container... Feb 20 02:53:31 localhost systemd[1]: Started metrics_qdr container. 
Feb 20 02:53:32 localhost python3[55599]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:33 localhost python3[55720]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005625204 step=1 update_config_hash_only=False Feb 20 02:53:33 localhost python3[55736]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:53:34 localhost python3[55752]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 20 02:53:35 localhost sshd[55753]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:53:43 localhost sshd[55755]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:53:47 localhost sshd[55757]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:53:58 localhost sshd[55837]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:53:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 02:53:59 localhost podman[55838]: 2026-02-20 07:53:59.099194544 +0000 UTC m=+0.044767538 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 02:53:59 localhost podman[55838]: 2026-02-20 07:53:59.272985796 +0000 UTC m=+0.218558780 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 02:53:59 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 02:54:09 localhost sshd[55867]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:54:16 localhost sshd[55869]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:54:28 localhost sshd[55871]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:54:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 02:54:30 localhost podman[55873]: 2026-02-20 07:54:30.200995592 +0000 UTC m=+0.119853072 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, distribution-scope=public, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64) Feb 20 02:54:30 localhost podman[55873]: 2026-02-20 07:54:30.420405359 +0000 UTC m=+0.339262859 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 02:54:30 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 02:54:33 localhost sshd[55902]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:54:35 localhost sshd[55904]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:54:42 localhost sshd[55906]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:54:51 localhost sshd[55908]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 02:55:00 localhost systemd[1]: tmp-crun.T5TZ5L.mount: Deactivated successfully. Feb 20 02:55:00 localhost podman[55988]: 2026-02-20 07:55:00.875829532 +0000 UTC m=+0.079355376 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64) Feb 20 02:55:01 localhost podman[55988]: 2026-02-20 07:55:01.102280089 +0000 UTC m=+0.305805893 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, architecture=x86_64, version=17.1.13, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 02:55:01 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 02:55:01 localhost sshd[56019]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:03 localhost sshd[56021]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:08 localhost sshd[56023]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:13 localhost sshd[56025]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:23 localhost sshd[56027]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:24 localhost sshd[56029]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:30 localhost sshd[56031]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 02:55:32 localhost podman[56033]: 2026-02-20 07:55:32.135846948 +0000 UTC m=+0.077499400 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, container_name=metrics_qdr, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 02:55:32 localhost podman[56033]: 2026-02-20 07:55:32.339238444 +0000 UTC m=+0.280890866 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, release=1766032510, url=https://www.redhat.com, container_name=metrics_qdr, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 
qdrouterd, io.buildah.version=1.41.5) Feb 20 02:55:32 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 02:55:37 localhost sshd[56062]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:46 localhost sshd[56064]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:55:52 localhost sshd[56066]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:00 localhost sshd[56146]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 02:56:03 localhost systemd[1]: tmp-crun.8tedRg.mount: Deactivated successfully. Feb 20 02:56:03 localhost podman[56148]: 2026-02-20 07:56:03.150197423 +0000 UTC m=+0.085544573 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, architecture=x86_64, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z) Feb 20 02:56:03 localhost podman[56148]: 2026-02-20 07:56:03.31988743 +0000 UTC m=+0.255234540 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.5) Feb 20 02:56:03 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 02:56:11 localhost sshd[56177]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:12 localhost sshd[56178]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:18 localhost sshd[56181]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:28 localhost sshd[56183]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 02:56:33 localhost podman[56185]: 2026-02-20 07:56:33.982239827 +0000 UTC m=+0.073065904 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc.) 
Feb 20 02:56:34 localhost podman[56185]: 2026-02-20 07:56:34.20415753 +0000 UTC m=+0.294983557 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, vendor=Red Hat, Inc., 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 20 02:56:34 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 02:56:35 localhost sshd[56214]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:45 localhost sshd[56216]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:54 localhost sshd[56218]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:56 localhost sshd[56297]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:59 localhost sshd[56299]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:56:59 localhost sshd[56301]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:57:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 02:57:05 localhost systemd[1]: tmp-crun.CpA3zb.mount: Deactivated successfully. 
Feb 20 02:57:05 localhost podman[56303]: 2026-02-20 07:57:05.201162173 +0000 UTC m=+0.136396925 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, container_name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z) Feb 20 02:57:05 localhost podman[56303]: 2026-02-20 07:57:05.422225734 +0000 UTC m=+0.357460526 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Feb 20 02:57:05 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 02:57:10 localhost sshd[56332]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:57:20 localhost sshd[56334]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:57:33 localhost sshd[56336]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 02:57:36 localhost systemd[1]: tmp-crun.qMVpx0.mount: Deactivated successfully. 
Feb 20 02:57:36 localhost podman[56338]: 2026-02-20 07:57:36.413461245 +0000 UTC m=+0.352575139 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 02:57:36 localhost podman[56338]: 2026-02-20 07:57:36.623765243 +0000 UTC m=+0.562879167 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
container_name=metrics_qdr, config_id=tripleo_step1, release=1766032510, vcs-type=git, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.buildah.version=1.41.5) Feb 20 02:57:36 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 02:57:37 localhost sshd[56367]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:57:39 localhost sshd[56369]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:57:43 localhost sshd[56371]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:57:48 localhost sshd[56373]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:57:58 localhost sshd[56452]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:58:04 localhost ceph-osd[33177]: osd.3 pg_epoch: 18 pg[2.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [4,5,3] r=2 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:58:05 localhost sshd[56454]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:58:06 localhost ceph-osd[32226]: osd.0 pg_epoch: 20 pg[3.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [5,4,0] r=2 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:58:06 localhost ceph-osd[33177]: osd.3 pg_epoch: 21 pg[4.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [3,4,5] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 02:58:07 localhost podman[56456]: 2026-02-20 07:58:07.145337158 +0000 UTC m=+0.089321586 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 02:58:07 localhost podman[56456]: 2026-02-20 07:58:07.36051411 +0000 UTC m=+0.304498528 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510) Feb 20 02:58:07 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 02:58:07 localhost ceph-osd[33177]: osd.3 pg_epoch: 22 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [3,4,5] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:09 localhost ceph-osd[33177]: osd.3 pg_epoch: 23 pg[5.0( empty local-lis/les=0/0 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [2,3,4] r=1 lpr=23 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:58:18 localhost sshd[56486]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:58:19 localhost ceph-osd[33177]: osd.3 pg_epoch: 31 pg[2.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=31 pruub=8.638875961s) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 active pruub 1116.948120117s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:19 localhost ceph-osd[33177]: osd.3 pg_epoch: 31 pg[2.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=31 pruub=8.636317253s) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1116.948120117s@ mbc={}] state: transitioning to Stray Feb 20 02:58:19 localhost ceph-osd[32226]: osd.0 pg_epoch: 31 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=31 pruub=10.822261810s) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 active pruub 1123.471557617s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,0], acting [5,4,0] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:19 localhost ceph-osd[32226]: osd.0 pg_epoch: 31 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=31 pruub=10.820794106s) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.471557617s@ mbc={}] state: 
transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.19( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.17( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1f( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1e( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1d( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1c( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1a( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1b( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: 
transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.9( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.8( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.5( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.2( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.6( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.4( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.7( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning 
to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.3( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.c( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.b( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.e( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.d( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.f( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.10( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.a( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 
20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.11( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.12( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.16( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.13( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.18( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.15( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.14( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.17( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 
02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.19( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1a( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1b( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.18( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.16( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.14( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.15( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.13( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 
02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.11( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.10( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.12( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.f( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.e( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.d( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1c( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.c( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 
localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.2( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.3( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.4( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.5( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.6( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.8( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.9( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost 
ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.7( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.a( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.b( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1d( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1e( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:20 localhost ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1f( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:21 localhost ceph-osd[33177]: osd.3 pg_epoch: 33 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=33 pruub=9.736856461s) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active pruub 1120.090332031s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,5], acting [3,4,5] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:21 localhost ceph-osd[33177]: osd.3 pg_epoch: 33 pg[5.0( empty 
local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=33 pruub=12.287407875s) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 active pruub 1122.640991211s@ mbc={}] start_peering_interval up [2,3,4] -> [2,3,4], acting [2,3,4] -> [2,3,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:21 localhost ceph-osd[33177]: osd.3 pg_epoch: 33 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=33 pruub=9.736856461s) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown pruub 1120.090332031s@ mbc={}] state: transitioning to Primary Feb 20 02:58:21 localhost ceph-osd[33177]: osd.3 pg_epoch: 33 pg[5.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=33 pruub=12.283823013s) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1122.640991211s@ mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1e( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1f( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1e( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.10( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 
pg[5.11( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.10( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.13( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.12( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.12( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.14( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.13( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.15( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.11( empty 
local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.15( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.14( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.16( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.17( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.17( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.16( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.9( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.9( empty local-lis/les=23/24 n=0 ec=33/23 
lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.8( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.8( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.b( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.a( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.b( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.a( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.c( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.d( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 
sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.d( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.c( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.4( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.5( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.19( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.7( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.18( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1b( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 
pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1a( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.6( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.6( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.7( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.5( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.3( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown 
NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.2( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.2( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.4( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.3( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.f( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.e( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.e( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1d( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: 
transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1c( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1d( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1c( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1b( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.f( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1a( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.19( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.18( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray 
Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1f( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.0( empty local-lis/les=33/34 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.18( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.3( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.2( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 
pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.4( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.19( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 
pg_epoch: 34 pg[4.1( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.7( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.9( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.6( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.8( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.14( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 
active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.15( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.16( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.5( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.17( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.13( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.12( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1e( empty 
local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.11( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.10( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:25 localhost ceph-osd[32226]: osd.0 pg_epoch: 35 pg[6.0( empty local-lis/les=0/0 n=0 ec=35/35 lis/c=0/0 les/c/f=0/0/0 sis=35) [0,4,2] r=0 lpr=35 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:25 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.0 scrub starts Feb 20 02:58:26 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.0 scrub ok Feb 20 02:58:26 localhost ceph-osd[32226]: osd.0 pg_epoch: 36 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=0/0 les/c/f=0/0/0 sis=35) [0,4,2] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:27 localhost ceph-osd[33177]: osd.3 pg_epoch: 36 pg[7.0( empty local-lis/les=0/0 n=0 ec=36/36 lis/c=0/0 les/c/f=0/0/0 sis=36) [1,5,3] r=2 lpr=36 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:58:27 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.1a scrub starts Feb 20 02:58:28 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.1a scrub ok Feb 20 02:58:29 localhost sshd[56489]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:58:29 localhost sshd[56491]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:58:30 localhost ceph-osd[33177]: 
log_channel(cluster) log [DBG] : 4.18 deep-scrub starts Feb 20 02:58:31 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.18 deep-scrub ok Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.4( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,1] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.183403969s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658691406s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.183325768s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658691406s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.3( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182825089s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658691406s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182825089s) [0,4,5] r=0 
lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.658691406s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.1f( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182967186s) [0,1,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.659423828s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,5], acting [5,4,0] -> [0,1,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182967186s) [0,1,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.659423828s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.19( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.1e( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1e( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182214737s) [3,2,4] r=-1 lpr=39 
pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.660034180s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,4], acting [5,4,0] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.18( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630276680s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801513672s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.19( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.636314392s) [2,3,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807495117s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,1], acting [3,4,5] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.18( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630190849s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801513672s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.19( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.636209488s) [2,3,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807495117s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182172775s) [3,2,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.660034180s@ mbc={}] 
state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.180464745s) [4,0,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658325195s@ mbc={}] start_peering_interval up [5,4,0] -> [4,0,5], acting [5,4,0] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.180406570s) [4,0,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658325195s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.636122704s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807739258s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.632182121s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.803833008s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1d( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178568840s) [2,4,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350341797s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], 
acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.636039734s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807739258s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.181464195s) [3,4,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.659423828s@ mbc={}] start_peering_interval up [5,4,0] -> [3,4,5], acting [5,4,0] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.181440353s) [3,4,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.659423828s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631990433s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.803833008s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.5( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631323814s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.803466797s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,3], acting [2,3,4] 
-> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631259918s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.803466797s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1d( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178009033s) [2,4,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350341797s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.7( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1c( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.177729607s) [2,1,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350219727s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,0], acting [4,5,3] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1c( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.177687645s) [2,1,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350219727s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628846169s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801513672s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], 
acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.634988785s) [2,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807739258s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628745079s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801513672s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.634940147s) [2,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807739258s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628884315s) [1,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801879883s@ mbc={}] start_peering_interval up [2,3,4] -> [1,0,2], acting [2,3,4] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631159782s) [4,2,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.804199219s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,0], acting [2,3,4] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 
-> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631127357s) [4,2,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.804199219s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628637314s) [1,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801879883s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630361557s) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.803710938s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,5], acting [2,3,4] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630361557s) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.803710938s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1e( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178629875s) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351440430s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1a( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 
pruub=13.177393913s) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350952148s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1e( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178629875s) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.351440430s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627976418s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801635742s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1a( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.177356720s) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350952148s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627976418s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.801635742s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627916336s) [2,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801635742s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,4], acting [3,4,5] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, 
features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.9( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176346779s) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350219727s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,5], acting [4,5,3] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.629531860s) [2,0,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.803466797s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,4], acting [2,3,4] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.175090790s) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654663086s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,2], acting [5,4,0] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.175090790s) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.654663086s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179287910s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658935547s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting 
[5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179253578s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658935547s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174358368s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654174805s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174321175s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.654174805s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178557396s) [2,0,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658691406s@ mbc={}] start_peering_interval up [5,4,0] -> [2,0,4], acting [5,4,0] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179244995s) [5,1,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.659423828s@ mbc={}] start_peering_interval up [5,4,0] -> [5,1,3], acting [5,4,0] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 
-1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178516388s) [2,0,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658691406s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.633274078s) [4,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807373047s@ mbc={}] start_peering_interval up [3,4,5] -> [4,5,0], acting [3,4,5] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179925919s) [3,5,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.660156250s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,4], acting [5,4,0] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179894447s) [3,5,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.660156250s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.9( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176346779s) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.350219727s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179183006s) 
[5,1,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.659423828s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.629234314s) [2,0,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.803466797s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627415657s) [2,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801635742s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627312660s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801757812s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.8( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.175848007s) [1,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350341797s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.b( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 
pruub=8.627437592s) [4,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.802001953s@ mbc={}] start_peering_interval up [2,3,4] -> [4,0,2], acting [2,3,4] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178317070s) [5,4,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.659301758s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.8( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.175795555s) [1,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350341797s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178133965s) [5,4,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.659301758s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.633084297s) [4,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807373047s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627393723s) [4,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.802001953s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.f( empty 
local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627050400s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801757812s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.4( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.2( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176150322s) [1,0,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351684570s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,2], acting [4,5,3] -> [1,0,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.2( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176084518s) [1,0,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.351684570s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176688194s) [3,2,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658691406s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176637650s) [3,2,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658691406s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 
pg[2.4( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174972534s) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350708008s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176395416s) [3,5,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658569336s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,1], acting [5,4,0] -> [3,5,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176329613s) [3,5,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658569336s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171910286s) [1,3,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654174805s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171862602s) [1,3,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.654174805s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 
les/c/f=32/32/0 sis=39 pruub=13.171505928s) [1,2,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653930664s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,3], acting [5,4,0] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171465874s) [1,2,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653930664s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.a( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.4( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174972534s) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.350708008s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176141739s) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352172852s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624894142s) [3,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801025391s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,2], acting [2,3,4] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 
4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176141739s) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.352172852s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624894142s) [3,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.801025391s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.6( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174986839s) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351562500s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.7( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174946785s) [4,2,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351562500s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,3], acting [4,5,3] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.6( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174986839s) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.351562500s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627578735s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 
crt=0'0 mlcod 0'0 active pruub 1128.804443359s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.e( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170887947s) [2,4,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654052734s@ mbc={}] start_peering_interval up [5,4,0] -> [2,4,0], acting [5,4,0] -> [2,4,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.e( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170839310s) [2,4,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.654052734s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170619965s) [1,5,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653930664s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,0], acting [5,4,0] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.7( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174694061s) [4,2,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.351562500s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170265198s) [1,5,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653808594s@ mbc={}] 
start_peering_interval up [5,4,0] -> [1,5,3], acting [5,4,0] -> [1,5,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.7( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170221329s) [1,5,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653808594s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170529366s) [1,5,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653930664s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170422554s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654174805s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170422554s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.654174805s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168743134s) [1,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652832031s@ 
mbc={}] start_peering_interval up [5,4,0] -> [1,2,0], acting [5,4,0] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168713570s) [1,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.652832031s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169594765s) [1,3,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653686523s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169507027s) [1,3,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653686523s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168907166s) [2,1,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653198242s@ mbc={}] start_peering_interval up [5,4,0] -> [2,1,0], acting [5,4,0] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168875694s) [2,1,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653198242s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost 
ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.7( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623968124s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801513672s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.5( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630991936s) [1,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808593750s@ mbc={}] start_peering_interval up [3,4,5] -> [1,5,0], acting [3,4,5] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.3( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174083710s) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351806641s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,5], acting [4,5,3] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.5( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630887032s) [1,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808593750s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.3( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174035072s) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.351806641s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.7( empty local-lis/les=33/34 n=0 
ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623549461s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801513672s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168410301s) [1,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652954102s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,5], acting [5,4,0] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168010712s) [3,2,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652709961s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167948723s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652709961s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168293953s) [1,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.652954102s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167943001s) 
[3,2,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.652709961s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167948723s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.652709961s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.16( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167690277s) [5,4,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653198242s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167144775s) [0,1,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652709961s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,2], acting [5,4,0] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.2( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628471375s) [4,3,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 
mlcod 0'0 active pruub 1128.807617188s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,2], acting [3,4,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628397942s) [4,3,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807617188s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167144775s) [0,1,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.652709961s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167296410s) [5,4,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653198242s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621114731s) [2,4,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800659180s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,0], acting [2,3,4] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627498627s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.804443359s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.b( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,4,5] 
r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621026039s) [2,4,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800659180s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165511131s) [5,3,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652709961s@ mbc={}] start_peering_interval up [5,4,0] -> [5,3,4], acting [5,4,0] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165446281s) [5,3,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.652709961s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628199577s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808227539s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620414734s) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800415039s@ mbc={}] start_peering_interval up [2,3,4] -> [3,4,2], acting [2,3,4] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 
4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620414734s) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.800415039s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.e( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.172441483s) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352539062s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621196747s) [1,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801635742s@ mbc={}] start_peering_interval up [2,3,4] -> [1,5,0], acting [2,3,4] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.12( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621147156s) [1,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801635742s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628086090s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808227539s@ mbc={}] state: 
transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.1e( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619400024s) [2,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800292969s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,1], acting [2,3,4] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.618499756s) [3,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799438477s@ mbc={}] start_peering_interval up [2,3,4] -> [3,5,4], acting [2,3,4] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.618499756s) [3,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.799438477s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619309425s) [2,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800292969s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.e( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.172441483s) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.352539062s@ mbc={}] state: 
transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.618299484s) [0,1,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799682617s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.11( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171576500s) [4,3,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352905273s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,2], acting [4,5,3] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627405167s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808715820s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.19( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.172412872s) [3,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353881836s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.11( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171514511s) [4,3,2] r=1 lpr=39 
pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.352905273s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627290726s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808715820s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.19( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.172412872s) [3,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.353881836s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.618128777s) [0,1,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.799682617s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.617027283s) [4,5,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.798706055s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,3], acting [2,3,4] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616981506s) [4,5,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.798706055s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.626946449s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808837891s@ mbc={}] start_peering_interval up 
[3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.626896858s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808837891s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.18( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171956062s) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353759766s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616610527s) [4,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.798583984s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,0], acting [2,3,4] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.18( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171730042s) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353759766s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616552353s) [4,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.798583984s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 
39 pg[2.17( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170897484s) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353393555s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.11( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.626388550s) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808837891s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,2], acting [3,4,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.11( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.626388550s) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.808837891s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.17( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170782089s) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353393555s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616309166s) [1,2,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.798950195s@ mbc={}] start_peering_interval up [2,3,4] -> [1,2,0], acting [2,3,4] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 
sis=39 pruub=8.616275787s) [1,2,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.798950195s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.16( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170540810s) [1,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353393555s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.10( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.625931740s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808837891s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.16( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170490265s) [1,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353393555s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615800858s) [5,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.798828125s@ mbc={}] start_peering_interval up [2,3,4] -> [5,1,3], acting [2,3,4] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.10( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.625931740s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 
0'0 unknown pruub 1128.808837891s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615706444s) [5,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.798828125s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.15( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170089722s) [5,0,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353271484s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.17( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.625356674s) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808715820s@ mbc={}] start_peering_interval up [3,4,5] -> [3,1,5], acting [3,4,5] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.15( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170002937s) [5,0,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353271484s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.17( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.625356674s) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.808715820s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.13( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 
pruub=8.624859810s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808715820s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.13( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615407944s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799316406s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,1], acting [2,3,4] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.12( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624911308s) [0,5,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808715820s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.13( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624801636s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808715820s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.12( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624871254s) [0,5,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808715820s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615819931s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 
active pruub 1128.799804688s@ mbc={}] start_peering_interval up [2,3,4] -> [3,2,4], acting [2,3,4] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615819931s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.799804688s@ mbc={}] state: transitioning to Primary Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.14( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169187546s) [4,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353271484s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,0], acting [4,5,3] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.13( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169013023s) [2,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353149414s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,3], acting [4,5,3] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.14( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169098854s) [4,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353271484s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.13( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168964386s) [2,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353149414s@ mbc={}] state: transitioning to Stray 
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.15( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624094963s) [5,3,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808471680s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,1], acting [3,4,5] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615510941s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799926758s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.15( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624039650s) [5,3,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808471680s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615470886s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.799926758s@ mbc={}] state: transitioning to Stray Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.12( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168498993s) [5,3,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353149414s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.12( empty 
local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168471336s) [5,3,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353149414s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.13( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614545822s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.799316406s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615341187s) [1,3,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800048828s@ mbc={}] start_peering_interval up [2,3,4] -> [1,3,2], acting [2,3,4] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.14( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623558998s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808349609s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615289688s) [1,3,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800048828s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.10( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168370247s) [2,0,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353271484s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,4], acting [4,5,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.14( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623507500s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808349609s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.10( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168342590s) [2,0,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353271484s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.9( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623075485s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808227539s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.9( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623042107s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808227539s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.8( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623076439s) [5,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808349609s@ mbc={}] start_peering_interval up [3,4,5] -> [5,4,3], acting [3,4,5] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.16( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623312950s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808593750s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.8( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623011589s) [5,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808349609s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614774704s) [0,2,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800170898s@ mbc={}] start_peering_interval up [2,3,4] -> [0,2,4], acting [2,3,4] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.d( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.166985512s) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352416992s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614736557s) [0,2,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800170898s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.622882843s) [0,1,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808471680s@ mbc={}] start_peering_interval up [3,4,5] -> [0,1,5], acting [3,4,5] -> [0,1,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.622827530s) [0,1,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808471680s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.16( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.622802734s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808593750s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614774704s) [5,0,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800415039s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,4], acting [2,3,4] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.d( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.166861534s) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.352416992s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.c( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.166196823s) [2,0,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352294922s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614451408s) [5,0,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800415039s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.c( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.166126251s) [2,0,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.352294922s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621610641s) [1,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807983398s@ mbc={}] start_peering_interval up [3,4,5] -> [1,0,2], acting [3,4,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.b( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165349007s) [5,1,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351684570s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,0], acting [4,5,3] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.b( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165320396s) [5,1,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.351684570s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.a( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165679932s) [2,3,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352172852s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.a( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165613174s) [2,3,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.352172852s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614287376s) [5,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800903320s@ mbc={}] start_peering_interval up [2,3,4] -> [5,3,4], acting [2,3,4] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.6( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621454239s) [5,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808227539s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,4], acting [3,4,5] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614025116s) [5,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800903320s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.6( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621428490s) [5,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808227539s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621025085s) [2,1,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808105469s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,0], acting [3,4,5] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621115685s) [1,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807983398s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.7( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620933533s) [0,5,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808105469s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620966911s) [2,1,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808105469s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614357948s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801513672s@ mbc={}] start_peering_interval up [2,3,4] -> [0,4,5], acting [2,3,4] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.7( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620886803s) [0,5,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808105469s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614321709s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801513672s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.4( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619892120s) [0,5,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807495117s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,1], acting [3,4,5] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.4( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619792938s) [0,5,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807495117s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.613944054s) [0,1,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801757812s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.2( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619462013s) [2,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807373047s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.613870621s) [0,1,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801757812s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.5( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162936211s) [2,0,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350952148s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.2( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619343758s) [2,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807373047s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.5( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162478447s) [2,0,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350952148s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1b( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162004471s) [5,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350708008s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,3], acting [4,5,3] -> [5,4,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1b( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.161961555s) [5,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350708008s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1f( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.152458191s) [0,4,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.341430664s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1f( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.152385712s) [0,4,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.341430664s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614631653s) [0,1,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.804199219s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,5], acting [2,3,4] -> [0,1,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.3( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.617092133s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.806762695s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.3( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.617017746s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.806762695s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.f( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162965775s) [2,4,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353027344s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.613686562s) [0,1,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.804199219s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.18( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:58:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.f( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162833214s) [2,4,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353027344s@ mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.10( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [4,5,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.c( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.a( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.5( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.16( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.13( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,3,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.16( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.11( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,2,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.a( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,0,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.5( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,5,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.1b( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,0,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.9( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,5,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.8( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.2( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,0,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.10( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.14( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [4,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.2( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [4,0,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1c( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,3,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.e( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [4,5,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.1c( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [4,2,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.d( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,2,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1d( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1a( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,3,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1b( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.9( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.b( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [5,0,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.1d( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,4,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.f( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,4,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.8( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [2,0,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.b( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,1,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.9( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [5,0,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.14( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [5,0,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.13( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [5,0,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.15( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,0,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.c( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,0,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.1c( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,1,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.5( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,0,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.1( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [2,1,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.d( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [2,4,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.e( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [2,0,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.10( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,0,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.1e( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.19( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,1,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.6( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.19( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.1f( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,1,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.4( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,1] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.5( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.7( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.b( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.16( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.12( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.12( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.17( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.1e( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.1d( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.1( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.2( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.7( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.9( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.b( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.17( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[4.17( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.1e( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.a( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.1( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[2.1f( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.3( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.18( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.19( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.e( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.c( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.6( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.1e( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,4] r=0 lpr=39
pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.4( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.6( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[4.11( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[4.f( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[4.10( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.14( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:32 localhost ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.4( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:58:35 localhost sshd[56524]: main: 
sshd: ssh-rsa algorithm is disabled Feb 20 02:58:36 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.12 scrub starts Feb 20 02:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 02:58:38 localhost podman[56541]: 2026-02-20 07:58:38.129465745 +0000 UTC m=+0.070461675 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, config_id=tripleo_step1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public) Feb 20 02:58:38 localhost podman[56541]: 2026-02-20 07:58:38.321933095 +0000 UTC m=+0.262928945 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 02:58:38 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 02:58:41 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.17 scrub starts Feb 20 02:58:41 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.17 scrub ok Feb 20 02:58:46 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.19 scrub starts Feb 20 02:58:46 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.19 scrub ok Feb 20 02:58:47 localhost sshd[56573]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:58:47 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.1 scrub starts Feb 20 02:58:47 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.1 scrub ok Feb 20 02:58:50 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts Feb 20 02:58:50 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok Feb 20 02:58:51 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.6 scrub starts Feb 20 02:58:51 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.6 scrub ok Feb 20 02:58:52 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.1e scrub starts Feb 20 02:58:53 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.1e scrub ok Feb 20 02:58:55 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.a scrub starts Feb 20 02:58:55 localhost sshd[56575]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:58:55 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.a scrub ok Feb 20 02:58:57 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.6 scrub starts Feb 20 02:58:57 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.6 scrub ok Feb 20 02:58:58 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.1f scrub starts Feb 20 02:58:59 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.1f scrub ok Feb 20 02:59:00 localhost python3[56592]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:00 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.3 scrub starts Feb 20 02:59:01 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.3 scrub ok Feb 20 02:59:02 localhost python3[56608]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:04 localhost python3[56624]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:04 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.12 scrub starts Feb 20 02:59:04 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.12 scrub ok Feb 20 02:59:05 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.14 scrub starts Feb 20 02:59:05 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.14 scrub ok Feb 20 02:59:05 localhost sshd[56625]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:59:06 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.1e scrub starts Feb 20 02:59:07 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.1e scrub ok Feb 20 02:59:07 localhost 
python3[56674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:08 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.17 deep-scrub starts Feb 20 02:59:08 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.17 deep-scrub ok Feb 20 02:59:08 localhost python3[56717]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574347.6070158-92373-97076641946657/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=8e2004121a34320613d32710ae37702da8d027e6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 02:59:09 localhost systemd[1]: tmp-crun.OMzGSO.mount: Deactivated successfully. 
Feb 20 02:59:09 localhost podman[56732]: 2026-02-20 07:59:09.14872189 +0000 UTC m=+0.086170049 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, tcib_managed=true, version=17.1.13, 
vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5) Feb 20 02:59:09 localhost podman[56732]: 2026-02-20 07:59:09.349013725 +0000 UTC m=+0.286461884 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, 
cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc.) Feb 20 02:59:09 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 02:59:10 localhost sshd[56759]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:59:10 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.19 scrub starts Feb 20 02:59:10 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.19 scrub ok Feb 20 02:59:11 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.19 deep-scrub starts Feb 20 02:59:11 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.19 deep-scrub ok Feb 20 02:59:12 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.11 scrub starts Feb 20 02:59:12 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.11 scrub ok Feb 20 02:59:13 localhost python3[56808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:13 localhost python3[56851]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574352.7095907-92373-118933211119620/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=417007d20895a54571330144b727b714177f3d13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:14 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.b scrub starts Feb 20 02:59:14 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.b scrub ok Feb 20 02:59:15 localhost sshd[56866]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:59:16 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.10 scrub starts Feb 20 02:59:16 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.10 scrub ok Feb 20 02:59:16 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] 
: 5.5 scrub starts Feb 20 02:59:16 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.5 scrub ok Feb 20 02:59:17 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.e scrub starts Feb 20 02:59:17 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.e scrub ok Feb 20 02:59:17 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.16 deep-scrub starts Feb 20 02:59:17 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.16 deep-scrub ok Feb 20 02:59:18 localhost python3[56915]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:18 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.1 scrub starts Feb 20 02:59:18 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.1 scrub ok Feb 20 02:59:18 localhost python3[56958]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574357.8343172-92373-95668928802339/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=2a03ad5f1837679340274b70e67e768ad4c81335 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:19 localhost sshd[56973]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:59:22 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.6 scrub starts Feb 20 02:59:22 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.6 scrub ok Feb 20 02:59:22 localhost python3[57022]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:22 localhost ceph-osd[32226]: 
log_channel(cluster) log [DBG] : 4.7 scrub starts Feb 20 02:59:23 localhost python3[57067]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574362.3786244-92796-268830979843795/source _original_basename=tmpxxzpl50x follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:23 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.4 scrub starts Feb 20 02:59:23 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.4 scrub ok Feb 20 02:59:23 localhost ceph-osd[32226]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=43 pruub=15.158725739s) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 1191.458862305s@ mbc={}] start_peering_interval up [0,4,2] -> [0,4,2], acting [0,4,2] -> [0,4,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:23 localhost ceph-osd[33177]: osd.3 pg_epoch: 43 pg[7.0( v 40'39 (0'0,40'39] local-lis/les=36/37 n=22 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=43 pruub=8.497297287s) [1,5,3] r=2 lpr=43 pi=[36,43)/1 luod=0'0 lua=38'37 crt=40'39 lcod 38'38 mlcod 0'0 active pruub 1180.491577148s@ mbc={}] start_peering_interval up [1,5,3] -> [1,5,3], acting [1,5,3] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:23 localhost ceph-osd[32226]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=43 pruub=15.158725739s) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1191.458862305s@ mbc={}] state: 
transitioning to Primary Feb 20 02:59:23 localhost ceph-osd[33177]: osd.3 pg_epoch: 43 pg[7.0( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=43 pruub=8.494708061s) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 lcod 38'38 mlcod 0'0 unknown NOTIFY pruub 1180.491577148s@ mbc={}] state: transitioning to Stray Feb 20 02:59:23 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.4 scrub starts Feb 20 02:59:24 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.4 scrub ok Feb 20 02:59:24 localhost python3[57129]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:24 
localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.c( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.d( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.1( v 40'39 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.4( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.2( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.3( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.5( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.f( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.e( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.9( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.b( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.7( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.8( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.a( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.0( empty local-lis/les=43/44 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:24 localhost python3[57172]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574363.9345176-92882-190388927568376/source _original_basename=tmpt7i83os7 follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 02:59:24 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Feb 20 02:59:24 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Feb 20 02:59:25 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.f deep-scrub starts
Feb 20 02:59:25 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.f deep-scrub ok
Feb 20 02:59:25 localhost python3[57202]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Feb 20 02:59:25 localhost python3[57220]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 02:59:27 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Feb 20 02:59:27 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Feb 20 02:59:27 localhost ansible-async_wrapper.py[57392]: Invoked with 986433874107 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574366.7728791-92975-202100251770581/AnsiballZ_command.py _
Feb 20 02:59:27 localhost ansible-async_wrapper.py[57395]: Starting module and watcher
Feb 20 02:59:27 localhost ansible-async_wrapper.py[57395]: Start watching 57396 (3600)
Feb 20 02:59:27 localhost ansible-async_wrapper.py[57396]: Start module (57396)
Feb 20 02:59:27 localhost ansible-async_wrapper.py[57392]: Return async_wrapper task started.
Feb 20 02:59:27 localhost sshd[57417]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:59:27 localhost python3[57416]: ansible-ansible.legacy.async_status Invoked with jid=986433874107.57392 mode=status _async_dir=/tmp/.ansible_async
Feb 20 02:59:28 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Feb 20 02:59:28 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Feb 20 02:59:29 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.1e deep-scrub starts
Feb 20 02:59:29 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.1e deep-scrub ok
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.c( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.6( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,4,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.985249519s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.985187531s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040893555s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.4( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983015060s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040527344s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.982960701s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040527344s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983129501s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983056068s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040893555s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983232498s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.041137695s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983161926s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.041137695s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.982010841s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040039062s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981976509s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040039062s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.1( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981136322s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.039428711s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.1( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981098175s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.039428711s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981787682s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040039062s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981742859s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040039062s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,4,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,2,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.980316162s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040039062s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.980286598s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040039062s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.18( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970592499s) [0,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337646484s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,2], acting [0,4,2] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967342377s) [4,2,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.334350586s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.18( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970592499s) [0,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.337646484s@ mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967270851s) [4,2,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.334350586s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970020294s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337402344s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969986916s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337402344s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.973694801s) [3,1,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.341186523s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.973631859s) [3,1,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.341186523s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969996452s) [4,5,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337646484s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969961166s) [4,5,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337646484s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968859673s) [4,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.336669922s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968693733s) [4,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.336669922s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969968796s) [3,4,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337768555s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968120575s) [4,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.336059570s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969835281s) [3,4,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337768555s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968057632s) [4,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.336059570s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.9( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970770836s) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.338867188s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.9( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970770836s) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.338867188s@ mbc={}] state: transitioning to Primary
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.971567154s) [3,1,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.339965820s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.5( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969411850s) [4,2,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337890625s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970956802s) [3,1,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.339477539s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,2], acting [0,4,2] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.5( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969371796s) [4,2,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337890625s@ mbc={}] state: transitioning to Stray
Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.971508980s) [3,1,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.339965820s@ mbc={}] state: transitioning to Stray
Feb
20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970055580s) [4,0,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.338989258s@ mbc={}] start_peering_interval up [0,4,2] -> [4,0,2], acting [0,4,2] -> [4,0,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970895767s) [3,1,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.339477539s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970000267s) [4,0,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.338989258s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.972677231s) [3,4,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.341674805s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.16( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967591286s) [0,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.336669922s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,5], acting [0,4,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.14( empty 
local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.972595215s) [3,4,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.341674805s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968501091s) [2,1,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337646484s@ mbc={}] start_peering_interval up [0,4,2] -> [2,1,3], acting [0,4,2] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.16( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967591286s) [0,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.336669922s@ mbc={}] state: transitioning to Primary Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968437195s) [2,1,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337646484s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968077660s) [3,5,4] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337402344s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,4], acting [0,4,2] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.10( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966302872s) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.335571289s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting 
[0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.10( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966302872s) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.335571289s@ mbc={}] state: transitioning to Primary Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967981339s) [3,5,4] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337402344s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965125084s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.334960938s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965083122s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.334960938s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965057373s) [4,5,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.334960938s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1d( empty 
local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970343590s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.340209961s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969474792s) [5,1,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.339355469s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,3], acting [0,4,2] -> [5,1,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969427109s) [5,1,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.339355469s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970239639s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.340209961s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970639229s) [3,2,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.340698242s@ mbc={}] start_peering_interval up [0,4,2] -> [3,2,1], acting [0,4,2] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 
sis=45 pruub=10.965029716s) [4,5,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.334960938s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970547676s) [3,2,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.340698242s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967091560s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337402344s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970276833s) [5,3,4] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.340698242s@ mbc={}] start_peering_interval up [0,4,2] -> [5,3,4], acting [0,4,2] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967032433s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337402344s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970193863s) [5,3,4] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.340698242s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.12( 
empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965265274s) [5,4,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.335937500s@ mbc={}] start_peering_interval up [0,4,2] -> [5,4,0], acting [0,4,2] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.964879990s) [5,0,1] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.335815430s@ mbc={}] start_peering_interval up [0,4,2] -> [5,0,1], acting [0,4,2] -> [5,0,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966756821s) [1,2,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337768555s@ mbc={}] start_peering_interval up [0,4,2] -> [1,2,3], acting [0,4,2] -> [1,2,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.19( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966783524s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337890625s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.19( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966718674s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337890625s@ mbc={}] state: 
transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.964653969s) [5,0,1] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.335815430s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966633797s) [1,2,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337768555s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.12( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965171814s) [5,4,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.335937500s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966559410s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337890625s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965885162s) [5,1,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337524414s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,0], acting [0,4,2] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966259003s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 
mlcod 0'0 unknown NOTIFY pruub 1193.337890625s@ mbc={}] state: transitioning to Stray Feb 20 02:59:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965665817s) [5,1,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337524414s@ mbc={}] state: transitioning to Stray Feb 20 02:59:30 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.12 scrub starts Feb 20 02:59:30 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.12 scrub ok Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [2,1,3] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.2( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [1,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [4,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.e( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [4,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.19( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [1,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [1,2,3] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray 
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.d( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [1,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1e( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [5,1,3] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [5,3,4] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 20 02:59:30 localhost ceph-osd[32226]: osd.0 pg_epoch: 46 pg[6.9( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[32226]: osd.0 pg_epoch: 46 pg[6.10( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[32226]: osd.0 pg_epoch: 46 pg[6.18( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [0,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[32226]: osd.0 pg_epoch: 46 pg[6.16( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [0,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.1f( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.14( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,4,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.11( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.6( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,4,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.4( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.c( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.f( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.1d( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.b( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:30 localhost ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.13( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,2,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 20 02:59:31 localhost puppet-user[57407]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 02:59:31 localhost puppet-user[57407]: (file: /etc/puppet/hiera.yaml)
Feb 20 02:59:31 localhost puppet-user[57407]: Warning: Undefined variable '::deploy_config_name';
Feb 20 02:59:31 localhost puppet-user[57407]: (file & line not available)
Feb 20 02:59:31 localhost puppet-user[57407]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 02:59:31 localhost puppet-user[57407]: (file & line not available)
Feb 20 02:59:31 localhost puppet-user[57407]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 20 02:59:31 localhost puppet-user[57407]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 20 02:59:31 localhost puppet-user[57407]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.12 seconds
Feb 20 02:59:31 localhost puppet-user[57407]: Notice: Applied catalog in 0.04 seconds
Feb 20 02:59:31 localhost puppet-user[57407]: Application:
Feb 20 02:59:31 localhost puppet-user[57407]: Initial environment: production
Feb 20 02:59:31 localhost puppet-user[57407]: Converged environment: production
Feb 20 02:59:31 localhost puppet-user[57407]: Run mode: user
Feb 20 02:59:31 localhost puppet-user[57407]: Changes:
Feb 20 02:59:31 localhost puppet-user[57407]: Events:
Feb 20 02:59:31 localhost puppet-user[57407]: Resources:
Feb 20 02:59:31 localhost puppet-user[57407]: Total: 10
Feb 20 02:59:31 localhost puppet-user[57407]: Time:
Feb 20 02:59:31 localhost puppet-user[57407]: Schedule: 0.00
Feb 20 02:59:31 localhost puppet-user[57407]: File: 0.00
Feb 20 02:59:31 localhost puppet-user[57407]: Exec: 0.01
Feb 20 02:59:31 localhost puppet-user[57407]: Augeas: 0.01
Feb 20 02:59:31 localhost puppet-user[57407]: Transaction evaluation: 0.03
Feb 20 02:59:31 localhost puppet-user[57407]: Catalog application: 0.04
Feb 20 02:59:31 localhost puppet-user[57407]: Config retrieval: 0.20
Feb 20 02:59:31 localhost puppet-user[57407]: Last run: 1771574371
Feb 20 02:59:31 localhost puppet-user[57407]: Filebucket: 0.00
Feb 20 02:59:31 localhost puppet-user[57407]: Total: 0.04
Feb 20 02:59:31 localhost puppet-user[57407]: Version:
Feb 20 02:59:31 localhost puppet-user[57407]: Config: 1771574371
Feb 20 02:59:31 localhost puppet-user[57407]: Puppet: 7.10.0
Feb 20 02:59:31 localhost ansible-async_wrapper.py[57396]: Module complete (57396)
Feb 20 02:59:32 localhost ansible-async_wrapper.py[57395]: Done in kid B.
Feb 20 02:59:33 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.7 deep-scrub starts
Feb 20 02:59:34 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Feb 20 02:59:34 localhost sshd[57530]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 02:59:35 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.7 deep-scrub ok
Feb 20 02:59:35 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Feb 20 02:59:35 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Feb 20 02:59:35 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Feb 20 02:59:37 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.b scrub starts
Feb 20 02:59:37 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.b scrub ok
Feb 20 02:59:37 localhost python3[57662]: ansible-ansible.legacy.async_status Invoked with jid=986433874107.57392 mode=status _async_dir=/tmp/.ansible_async
Feb 20 02:59:38 localhost python3[57694]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 02:59:39 localhost python3[57710]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 02:59:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 02:59:39 localhost podman[57761]: 2026-02-20 07:59:39.476338519 +0000 UTC m=+0.083390313 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, container_name=metrics_qdr, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 02:59:39 localhost python3[57760]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 02:59:39 localhost ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800383568s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1197.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:39 localhost ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800305367s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1197.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:39 localhost ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800383568s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1197.040893555s@ mbc={}] state: transitioning to Primary
Feb 20 02:59:39 localhost ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800305367s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1197.040893555s@ mbc={}] state: transitioning to Primary
Feb 20 02:59:39 localhost ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800362587s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1197.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:39 localhost ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800362587s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1197.040893555s@ mbc={}] state: transitioning to Primary
Feb 20 02:59:39 localhost ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.798828125s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1197.039794922s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 02:59:39 localhost ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.798828125s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod
0'0 mlcod 0'0 unknown pruub 1197.039794922s@ mbc={}] state: transitioning to Primary Feb 20 02:59:39 localhost podman[57761]: 2026-02-20 07:59:39.667851685 +0000 UTC m=+0.274903449 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 02:59:39 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 02:59:39 localhost python3[57806]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp9uhuy3f7 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 02:59:40 localhost sshd[57956]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:59:40 localhost python3[57955]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:40 localhost ceph-osd[33177]: osd.3 pg_epoch: 48 pg[7.2( v 40'39 
(0'0,40'39] local-lis/les=47/48 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:59:40 localhost ceph-osd[33177]: osd.3 pg_epoch: 48 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:59:40 localhost ceph-osd[33177]: osd.3 pg_epoch: 48 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:59:40 localhost ceph-osd[33177]: osd.3 pg_epoch: 48 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=47/48 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:59:41 localhost python3[58117]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 20 02:59:41 localhost ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.798413277s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1203.087524414s@ mbc={}] 
start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:41 localhost ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.798413277s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1203.087524414s@ mbc={}] state: transitioning to Primary Feb 20 02:59:41 localhost ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=45/46 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.797753334s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1203.086791992s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:41 localhost ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.802041054s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1203.091308594s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:41 localhost ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=45/46 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.797753334s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1203.086791992s@ mbc={}] state: transitioning to Primary Feb 20 02:59:41 localhost ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.802041054s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 
lcod 0'0 mlcod 0'0 unknown pruub 1203.091308594s@ mbc={}] state: transitioning to Primary Feb 20 02:59:41 localhost ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.797941208s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1203.087768555s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:41 localhost ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.797941208s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1203.087768555s@ mbc={}] state: transitioning to Primary Feb 20 02:59:41 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 20 02:59:41 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
Feb 20 02:59:42 localhost python3[58139]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:42 localhost ceph-osd[33177]: osd.3 pg_epoch: 50 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:59:42 localhost ceph-osd[33177]: osd.3 pg_epoch: 50 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:59:42 localhost ceph-osd[33177]: osd.3 pg_epoch: 50 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=49/50 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:59:42 localhost ceph-osd[33177]: osd.3 pg_epoch: 50 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 02:59:43 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.2 scrub starts Feb 20 02:59:43 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.10 scrub starts Feb 20 02:59:43 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.2 scrub ok Feb 20 02:59:43 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.10 scrub ok Feb 20 02:59:43 localhost python3[58171]: 
ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 02:59:43 localhost python3[58221]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:44 localhost python3[58239]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:44 localhost python3[58301]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:44 localhost python3[58319]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:45 localhost python3[58381]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:45 localhost python3[58399]: 
ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:46 localhost python3[58461]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:46 localhost python3[58479]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:46 localhost python3[58509]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:59:46 localhost systemd[1]: Reloading. Feb 20 02:59:47 localhost systemd-sysv-generator[58539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 02:59:47 localhost systemd-rc-local-generator[58534]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:59:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 02:59:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4236 writes, 19K keys, 4236 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4236 writes, 358 syncs, 11.83 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 978 writes, 3451 keys, 978 commit groups, 1.0 writes per commit group, ingest: 1.47 MB, 0.00 MB/s#012Interval WAL: 978 writes, 213 syncs, 4.59 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) 
CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Feb 20 02:59:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 02:59:47 localhost python3[58595]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:47 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.4 scrub starts Feb 20 02:59:48 localhost python3[58613]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:48 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.4 scrub ok Feb 20 02:59:48 localhost python3[58675]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 02:59:48 localhost python3[58693]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:48 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.1e scrub starts Feb 20 02:59:49 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.1e scrub ok Feb 20 02:59:49 localhost 
python3[58723]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 02:59:49 localhost systemd[1]: Reloading. Feb 20 02:59:49 localhost systemd-sysv-generator[58752]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 02:59:49 localhost systemd-rc-local-generator[58748]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 02:59:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 02:59:49 localhost systemd[1]: Starting Create netns directory... Feb 20 02:59:49 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 20 02:59:49 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 20 02:59:49 localhost systemd[1]: Finished Create netns directory. 
Feb 20 02:59:49 localhost ceph-osd[33177]: osd.3 pg_epoch: 51 pg[7.4( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51 pruub=14.598536491s) [0,1,2] r=-1 lpr=51 pi=[43,51)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1213.040771484s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:49 localhost ceph-osd[33177]: osd.3 pg_epoch: 51 pg[7.4( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51 pruub=14.598453522s) [0,1,2] r=-1 lpr=51 pi=[43,51)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1213.040771484s@ mbc={}] state: transitioning to Stray Feb 20 02:59:49 localhost ceph-osd[33177]: osd.3 pg_epoch: 51 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51 pruub=14.597406387s) [0,1,2] r=-1 lpr=51 pi=[43,51)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1213.040161133s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:49 localhost ceph-osd[33177]: osd.3 pg_epoch: 51 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51 pruub=14.597302437s) [0,1,2] r=-1 lpr=51 pi=[43,51)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1213.040161133s@ mbc={}] state: transitioning to Stray Feb 20 02:59:49 localhost ceph-osd[32226]: osd.0 pg_epoch: 51 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51) [0,1,2] r=0 lpr=51 pi=[43,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:49 localhost ceph-osd[32226]: osd.0 pg_epoch: 51 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51) [0,1,2] r=0 
lpr=51 pi=[43,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:50 localhost python3[58782]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 20 02:59:50 localhost ceph-osd[32226]: osd.0 pg_epoch: 52 pg[7.4( v 40'39 lc 38'15 (0'0,40'39] local-lis/les=51/52 n=4 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51) [0,1,2] r=0 lpr=51 pi=[43,51)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(1+2)=4}}] state: react AllReplicasActivated Activating complete Feb 20 02:59:50 localhost ceph-osd[32226]: osd.0 pg_epoch: 52 pg[7.c( v 40'39 lc 38'17 (0'0,40'39] local-lis/les=51/52 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51) [0,1,2] r=0 lpr=51 pi=[43,51)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete Feb 20 02:59:51 localhost sshd[58824]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:59:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 02:59:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4907 writes, 22K keys, 4907 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4907 writes, 480 syncs, 10.22 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1520 writes, 5414 keys, 1520 commit groups, 1.0 writes per commit group, ingest: 2.10 MB, 0.00 MB/s#012Interval WAL: 1520 writes, 282 syncs, 5.39 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) 
Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 
occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 
level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, 
interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 m Feb 20 02:59:51 localhost python3[58840]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 20 02:59:51 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.18 scrub starts Feb 20 02:59:52 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.18 scrub ok Feb 20 02:59:52 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.7 scrub starts Feb 20 02:59:52 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.7 scrub ok Feb 20 02:59:52 localhost podman[58911]: 2026-02-20 07:59:52.271755511 +0000 UTC m=+0.086071216 container create 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute_init_log, config_id=tripleo_step2, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 02:59:52 localhost podman[58917]: 2026-02-20 07:59:52.290279211 +0000 UTC m=+0.089876773 container create 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step2, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, container_name=nova_virtqemud_init_logs) Feb 20 02:59:52 localhost podman[58911]: 2026-02-20 07:59:52.22128577 +0000 UTC m=+0.035601555 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 02:59:52 localhost systemd[1]: Started libpod-conmon-4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706.scope. Feb 20 02:59:52 localhost systemd[1]: Started libpod-conmon-823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb.scope. Feb 20 02:59:52 localhost podman[58917]: 2026-02-20 07:59:52.231494424 +0000 UTC m=+0.031092016 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 02:59:52 localhost systemd[1]: Started libcrun container. Feb 20 02:59:52 localhost systemd[1]: Started libcrun container. 
Feb 20 02:59:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63f33056b00261d0e07f47c80ba10ef73a797672a3169ee41fd4894170668f6e/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Feb 20 02:59:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e94527f44cf462204e4693ca956cece239562477adb3a43148eff33840dc865/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Feb 20 02:59:52 localhost podman[58911]: 2026-02-20 07:59:52.362704055 +0000 UTC m=+0.177019780 container init 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step2, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible) Feb 20 02:59:52 localhost podman[58911]: 2026-02-20 07:59:52.372761895 +0000 UTC m=+0.187077620 container start 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute_init_log, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510) Feb 20 02:59:52 localhost python3[58840]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Feb 20 02:59:52 localhost systemd[1]: libpod-4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706.scope: Deactivated successfully. 
Feb 20 02:59:52 localhost podman[58917]: 2026-02-20 07:59:52.412388563 +0000 UTC m=+0.211986125 container init 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, tcib_managed=true, container_name=nova_virtqemud_init_logs, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, 
io.openshift.expose-services=) Feb 20 02:59:52 localhost podman[58917]: 2026-02-20 07:59:52.42498127 +0000 UTC m=+0.224578832 container start 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud_init_logs, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_id=tripleo_step2, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
distribution-scope=public, vendor=Red Hat, Inc.) Feb 20 02:59:52 localhost python3[58840]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm Feb 20 02:59:52 localhost systemd[1]: libpod-823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb.scope: Deactivated successfully. 
Feb 20 02:59:52 localhost podman[58949]: 2026-02-20 07:59:52.465833925 +0000 UTC m=+0.076298136 container died 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13) Feb 20 02:59:52 localhost podman[58973]: 2026-02-20 07:59:52.492901797 +0000 UTC m=+0.049775761 
container died 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, container_name=nova_virtqemud_init_logs, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}) Feb 20 02:59:52 localhost podman[58949]: 2026-02-20 07:59:52.59809722 +0000 UTC 
m=+0.208561381 container cleanup 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute_init_log, distribution-scope=public, vcs-type=git) Feb 20 02:59:52 localhost systemd[1]: libpod-conmon-4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706.scope: Deactivated successfully. 
Feb 20 02:59:52 localhost podman[58979]: 2026-02-20 07:59:52.618061844 +0000 UTC m=+0.167034695 container cleanup 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, container_name=nova_virtqemud_init_logs, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_step2, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 02:59:52 localhost systemd[1]: libpod-conmon-823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb.scope: Deactivated successfully. Feb 20 02:59:53 localhost podman[59101]: 2026-02-20 07:59:53.022176783 +0000 UTC m=+0.078846824 container create 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=create_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git) Feb 20 02:59:53 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.16 deep-scrub starts Feb 20 02:59:53 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.16 deep-scrub ok Feb 20 02:59:53 localhost podman[59102]: 2026-02-20 07:59:53.057779667 +0000 UTC m=+0.106358970 container create f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, tcib_managed=true, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 20 02:59:53 localhost systemd[1]: Started 
libpod-conmon-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452.scope. Feb 20 02:59:53 localhost podman[59101]: 2026-02-20 07:59:52.978011405 +0000 UTC m=+0.034681456 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 02:59:53 localhost systemd[1]: Started libcrun container. Feb 20 02:59:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0efa0f5db57c39c7bb160a49b5780c03ae06dca3a570fc6900a29b607ec05de/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 20 02:59:53 localhost systemd[1]: Started libpod-conmon-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124.scope. Feb 20 02:59:53 localhost podman[59102]: 2026-02-20 07:59:52.994074419 +0000 UTC m=+0.042653732 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 20 02:59:53 localhost podman[59101]: 2026-02-20 07:59:53.097938391 +0000 UTC m=+0.154608422 container init 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, container_name=create_virtlogd_wrapper, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13) Feb 20 02:59:53 localhost systemd[1]: Started libcrun container. 
Feb 20 02:59:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e918dbe2b7e6f336ecb4cc5413e464b0e0467f389d3daf96290bbb17e0d3afb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 02:59:53 localhost podman[59101]: 2026-02-20 07:59:53.109715403 +0000 UTC m=+0.166385434 container start 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step2, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=create_virtlogd_wrapper, batch=17.1_20260112.1) Feb 20 02:59:53 localhost podman[59101]: 2026-02-20 07:59:53.110082884 +0000 UTC m=+0.166752975 container attach 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=create_virtlogd_wrapper, tcib_managed=true, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step2, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T23:31:49Z) Feb 20 02:59:53 localhost podman[59102]: 2026-02-20 07:59:53.118933075 +0000 UTC m=+0.167512358 container init f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 20 02:59:53 localhost podman[59102]: 2026-02-20 07:59:53.128075277 +0000 UTC m=+0.176654560 container start f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, container_name=create_haproxy_wrapper, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20260112.1) Feb 20 02:59:53 localhost podman[59102]: 2026-02-20 07:59:53.128547031 +0000 UTC m=+0.177126314 container attach f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, 
tcib_managed=true, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) 
Feb 20 02:59:53 localhost systemd[1]: var-lib-containers-storage-overlay-0e94527f44cf462204e4693ca956cece239562477adb3a43148eff33840dc865-merged.mount: Deactivated successfully. Feb 20 02:59:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb-userdata-shm.mount: Deactivated successfully. Feb 20 02:59:53 localhost systemd[1]: var-lib-containers-storage-overlay-63f33056b00261d0e07f47c80ba10ef73a797672a3169ee41fd4894170668f6e-merged.mount: Deactivated successfully. Feb 20 02:59:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706-userdata-shm.mount: Deactivated successfully. Feb 20 02:59:54 localhost ovs-vsctl[59203]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Feb 20 02:59:54 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.18 scrub starts Feb 20 02:59:54 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.18 scrub ok Feb 20 02:59:55 localhost systemd[1]: libpod-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452.scope: Deactivated successfully. Feb 20 02:59:55 localhost systemd[1]: libpod-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452.scope: Consumed 2.082s CPU time. 
Feb 20 02:59:55 localhost podman[59101]: 2026-02-20 07:59:55.186747573 +0000 UTC m=+2.243417604 container died 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, container_name=create_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 02:59:55 localhost systemd[1]: tmp-crun.LNP1Ln.mount: Deactivated successfully. Feb 20 02:59:55 localhost podman[59353]: 2026-02-20 07:59:55.263077399 +0000 UTC m=+0.063902585 container cleanup 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, config_id=tripleo_step2, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=create_virtlogd_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 20 02:59:55 localhost systemd[1]: libpod-conmon-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452.scope: Deactivated successfully. 
Feb 20 02:59:55 localhost python3[58840]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro 
--volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Feb 20 02:59:55 localhost ceph-osd[33177]: osd.3 pg_epoch: 53 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=14.464720726s) [2,0,4] r=-1 lpr=53 pi=[45,53)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1219.088256836s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:55 localhost ceph-osd[33177]: osd.3 pg_epoch: 53 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=45/46 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=14.464027405s) [2,0,4] r=-1 lpr=53 pi=[45,53)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1219.087646484s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:55 localhost ceph-osd[33177]: osd.3 pg_epoch: 53 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=14.464625359s) [2,0,4] r=-1 lpr=53 pi=[45,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1219.088256836s@ mbc={}] state: transitioning to Stray Feb 20 02:59:55 localhost ceph-osd[33177]: osd.3 pg_epoch: 53 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=45/46 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53 
pruub=14.463951111s) [2,0,4] r=-1 lpr=53 pi=[45,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1219.087646484s@ mbc={}] state: transitioning to Stray Feb 20 02:59:56 localhost systemd[1]: var-lib-containers-storage-overlay-d0efa0f5db57c39c7bb160a49b5780c03ae06dca3a570fc6900a29b607ec05de-merged.mount: Deactivated successfully. Feb 20 02:59:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452-userdata-shm.mount: Deactivated successfully. Feb 20 02:59:56 localhost systemd[1]: libpod-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124.scope: Deactivated successfully. Feb 20 02:59:56 localhost systemd[1]: libpod-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124.scope: Consumed 2.170s CPU time. Feb 20 02:59:56 localhost podman[59397]: 2026-02-20 07:59:56.870133475 +0000 UTC m=+0.049372957 container died f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=create_haproxy_wrapper, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true) Feb 20 02:59:56 localhost systemd[1]: tmp-crun.TEbLEa.mount: Deactivated successfully. 
Feb 20 02:59:56 localhost podman[59397]: 2026-02-20 07:59:56.907618098 +0000 UTC m=+0.086857550 container cleanup f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step2, container_name=create_haproxy_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 02:59:56 localhost systemd[1]: libpod-conmon-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124.scope: Deactivated successfully. Feb 20 02:59:56 localhost python3[58840]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Feb 20 02:59:57 localhost ceph-osd[32226]: osd.0 pg_epoch: 53 pg[7.d( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53) [2,0,4] r=1 lpr=53 pi=[45,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:59:57 localhost ceph-osd[32226]: osd.0 pg_epoch: 53 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53) [2,0,4] r=1 lpr=53 pi=[45,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 02:59:57 localhost systemd[1]: 
var-lib-containers-storage-overlay-2e918dbe2b7e6f336ecb4cc5413e464b0e0467f389d3daf96290bbb17e0d3afb-merged.mount: Deactivated successfully. Feb 20 02:59:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124-userdata-shm.mount: Deactivated successfully. Feb 20 02:59:57 localhost python3[59449]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 02:59:58 localhost ceph-osd[32226]: osd.0 pg_epoch: 55 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55) [0,4,5] r=0 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:58 localhost ceph-osd[32226]: osd.0 pg_epoch: 55 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55) [0,4,5] r=0 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 02:59:58 localhost ceph-osd[33177]: osd.3 pg_epoch: 55 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=47/48 n=2 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=14.583439827s) [0,4,5] r=-1 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1221.292724609s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:58 localhost ceph-osd[33177]: osd.3 pg_epoch: 55 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=14.583265305s) [0,4,5] r=-1 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active 
pruub 1221.292602539s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 02:59:58 localhost ceph-osd[33177]: osd.3 pg_epoch: 55 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=47/48 n=2 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=14.583332062s) [0,4,5] r=-1 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1221.292724609s@ mbc={}] state: transitioning to Stray Feb 20 02:59:58 localhost ceph-osd[33177]: osd.3 pg_epoch: 55 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=14.583180428s) [0,4,5] r=-1 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1221.292602539s@ mbc={}] state: transitioning to Stray Feb 20 02:59:58 localhost sshd[59498]: main: sshd: ssh-rsa algorithm is disabled Feb 20 02:59:59 localhost ceph-osd[32226]: osd.0 pg_epoch: 56 pg[7.e( v 40'39 lc 38'19 (0'0,40'39] local-lis/les=55/56 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55) [0,4,5] r=0 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete Feb 20 02:59:59 localhost ceph-osd[32226]: osd.0 pg_epoch: 56 pg[7.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=55/56 n=2 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55) [0,4,5] r=0 lpr=55 pi=[47,55)/1 crt=40'39 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete Feb 20 02:59:59 localhost python3[59571]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005625204 step=2 update_config_hash_only=False Feb 20 02:59:59 localhost python3[59588]: ansible-file Invoked with path=/var/log/containers/stdouts 
state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:00:00 localhost ceph-osd[33177]: osd.3 pg_epoch: 57 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=14.600752831s) [1,5,3] r=2 lpr=57 pi=[49,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1223.322021484s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:00 localhost ceph-osd[33177]: osd.3 pg_epoch: 57 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=14.600689888s) [1,5,3] r=2 lpr=57 pi=[49,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1223.322021484s@ mbc={}] state: transitioning to Stray Feb 20 03:00:00 localhost ceph-osd[33177]: osd.3 pg_epoch: 57 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=14.600103378s) [1,5,3] r=2 lpr=57 pi=[49,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1223.322143555s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:00 localhost ceph-osd[33177]: osd.3 pg_epoch: 57 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=14.600008965s) [1,5,3] r=2 lpr=57 pi=[49,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1223.322143555s@ mbc={}] state: transitioning to Stray Feb 20 03:00:00 localhost python3[59604]: ansible-container_config_data Invoked with 
config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 20 03:00:01 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.b scrub starts Feb 20 03:00:01 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.b scrub ok Feb 20 03:00:02 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.6 scrub starts Feb 20 03:00:02 localhost ceph-osd[33177]: osd.3 pg_epoch: 59 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=59 pruub=10.256752968s) [3,4,5] r=0 lpr=59 pi=[43,59)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1221.042114258s@ mbc={}] start_peering_interval up [1,5,3] -> [3,4,5], acting [1,5,3] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:02 localhost ceph-osd[33177]: osd.3 pg_epoch: 59 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=59 pruub=10.256752968s) [3,4,5] r=0 lpr=59 pi=[43,59)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1221.042114258s@ mbc={}] state: transitioning to Primary Feb 20 03:00:02 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.6 scrub ok Feb 20 03:00:03 localhost ceph-osd[33177]: osd.3 pg_epoch: 60 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=59/60 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=59) [3,4,5] r=0 lpr=59 pi=[43,59)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 03:00:03 localhost sshd[59605]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:06 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.e deep-scrub starts Feb 20 03:00:06 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.13 scrub starts Feb 20 03:00:06 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.e deep-scrub ok Feb 20 03:00:06 localhost ceph-osd[33177]: 
log_channel(cluster) log [DBG] : 6.13 scrub ok Feb 20 03:00:07 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.1f scrub starts Feb 20 03:00:07 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.1f scrub ok Feb 20 03:00:07 localhost sshd[59607]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:09 localhost ceph-osd[32226]: osd.0 pg_epoch: 61 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=61) [0,2,4] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 03:00:09 localhost ceph-osd[33177]: osd.3 pg_epoch: 61 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=8.968554497s) [0,2,4] r=-1 lpr=61 pi=[45,61)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1227.091918945s@ mbc={}] start_peering_interval up [4,2,3] -> [0,2,4], acting [4,2,3] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:09 localhost ceph-osd[33177]: osd.3 pg_epoch: 61 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=8.968462944s) [0,2,4] r=-1 lpr=61 pi=[45,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1227.091918945s@ mbc={}] state: transitioning to Stray Feb 20 03:00:10 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.c scrub starts Feb 20 03:00:10 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.c scrub ok Feb 20 03:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:00:10 localhost podman[59609]: 2026-02-20 08:00:10.163666197 +0000 UTC m=+0.088706158 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1766032510, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 
17.1_20260112.1, container_name=metrics_qdr, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 20 03:00:10 localhost podman[59609]: 2026-02-20 08:00:10.377945861 +0000 UTC m=+0.302985832 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, io.buildah.version=1.41.5, 
version=17.1.13, architecture=x86_64, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:00:10 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:00:10 localhost ceph-osd[32226]: osd.0 pg_epoch: 62 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=61/62 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=61) [0,2,4] r=0 lpr=61 pi=[45,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 20 03:00:11 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.6 scrub starts Feb 20 03:00:11 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.6 scrub ok Feb 20 03:00:12 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.9 scrub starts Feb 20 03:00:12 localhost ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.9 scrub ok Feb 20 03:00:12 localhost sshd[59639]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:15 localhost ceph-osd[33177]: osd.3 pg_epoch: 63 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=63 pruub=13.070129395s) [2,0,4] r=-1 lpr=63 pi=[47,63)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1237.293090820s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:15 localhost ceph-osd[33177]: osd.3 pg_epoch: 63 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=63 pruub=13.069807053s) [2,0,4] r=-1 lpr=63 pi=[47,63)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1237.293090820s@ mbc={}] state: transitioning to Stray Feb 20 03:00:16 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.f scrub starts Feb 20 03:00:16 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.f scrub ok Feb 20 03:00:16 localhost ceph-osd[32226]: osd.0 pg_epoch: 63 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=63) [2,0,4] r=1 lpr=63 pi=[47,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 03:00:17 localhost ceph-osd[33177]: osd.3 
pg_epoch: 65 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=65 pruub=13.054226875s) [3,1,2] r=0 lpr=65 pi=[49,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1239.326904297s@ mbc={}] start_peering_interval up [3,4,2] -> [3,1,2], acting [3,4,2] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:17 localhost ceph-osd[33177]: osd.3 pg_epoch: 65 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=65 pruub=13.054226875s) [3,1,2] r=0 lpr=65 pi=[49,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1239.326904297s@ mbc={}] state: transitioning to Primary Feb 20 03:00:18 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.11 scrub starts Feb 20 03:00:18 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.11 scrub ok Feb 20 03:00:18 localhost sshd[59641]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:19 localhost ceph-osd[33177]: osd.3 pg_epoch: 66 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=65/66 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=65) [3,1,2] r=0 lpr=65 pi=[49,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state: react AllReplicasActivated Activating complete Feb 20 03:00:19 localhost ceph-osd[32226]: osd.0 pg_epoch: 67 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=51/52 n=1 ec=43/36 lis/c=51/51 les/c/f=52/52/0 sis=67 pruub=11.157955170s) [1,3,2] r=-1 lpr=67 pi=[51,67)/1 crt=40'39 mlcod 0'0 active pruub 1243.806396484s@ mbc={255={}}] start_peering_interval up [0,1,2] -> [1,3,2], acting [0,1,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:19 localhost ceph-osd[32226]: osd.0 pg_epoch: 67 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=51/52 n=1 ec=43/36 lis/c=51/51 les/c/f=52/52/0 sis=67 pruub=11.157869339s) [1,3,2] r=-1 lpr=67 pi=[51,67)/1 
crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1243.806396484s@ mbc={}] state: transitioning to Stray Feb 20 03:00:20 localhost ceph-osd[33177]: osd.3 pg_epoch: 67 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=51/51 les/c/f=52/52/0 sis=67) [1,3,2] r=1 lpr=67 pi=[51,67)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 03:00:21 localhost ceph-osd[32226]: osd.0 pg_epoch: 69 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=43/36 lis/c=53/53 les/c/f=54/54/0 sis=69 pruub=15.295172691s) [1,3,5] r=-1 lpr=69 pi=[53,69)/1 luod=0'0 crt=40'39 mlcod 0'0 active pruub 1249.991577148s@ mbc={}] start_peering_interval up [2,0,4] -> [1,3,5], acting [2,0,4] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:21 localhost ceph-osd[32226]: osd.0 pg_epoch: 69 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=43/36 lis/c=53/53 les/c/f=54/54/0 sis=69 pruub=15.295073509s) [1,3,5] r=-1 lpr=69 pi=[53,69)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1249.991577148s@ mbc={}] state: transitioning to Stray Feb 20 03:00:22 localhost ceph-osd[33177]: osd.3 pg_epoch: 69 pg[7.d( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=53/53 les/c/f=54/54/0 sis=69) [1,3,5] r=1 lpr=69 pi=[53,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 20 03:00:23 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.4 scrub starts Feb 20 03:00:23 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.4 scrub ok Feb 20 03:00:24 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.14 scrub starts Feb 20 03:00:24 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.14 scrub ok Feb 20 03:00:25 localhost sshd[59643]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:26 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.1d scrub starts Feb 20 03:00:26 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.1d scrub ok Feb 20 
03:00:28 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.2 scrub starts Feb 20 03:00:28 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.2 scrub ok Feb 20 03:00:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 71 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=55/56 n=1 ec=43/36 lis/c=55/55 les/c/f=56/56/0 sis=71 pruub=9.173424721s) [3,5,1] r=-1 lpr=71 pi=[55,71)/1 crt=40'39 mlcod 0'0 active pruub 1252.020019531s@ mbc={255={}}] start_peering_interval up [0,4,5] -> [3,5,1], acting [0,4,5] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:29 localhost ceph-osd[32226]: osd.0 pg_epoch: 71 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=55/56 n=1 ec=43/36 lis/c=55/55 les/c/f=56/56/0 sis=71 pruub=9.173316956s) [3,5,1] r=-1 lpr=71 pi=[55,71)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1252.020019531s@ mbc={}] state: transitioning to Stray Feb 20 03:00:29 localhost ceph-osd[33177]: osd.3 pg_epoch: 71 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=55/55 les/c/f=56/56/0 sis=71) [3,5,1] r=0 lpr=71 pi=[55,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 03:00:30 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.3 scrub starts Feb 20 03:00:30 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.3 scrub ok Feb 20 03:00:31 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.8 scrub starts Feb 20 03:00:31 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.8 scrub ok Feb 20 03:00:31 localhost sshd[59645]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 72 pg[7.e( v 40'39 lc 38'19 (0'0,40'39] local-lis/les=71/72 n=1 ec=43/36 lis/c=55/55 les/c/f=56/56/0 sis=71) [3,5,1] r=0 lpr=71 pi=[55,71)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete Feb 20 03:00:31 localhost 
ceph-osd[33177]: osd.3 pg_epoch: 73 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=57/58 n=1 ec=43/36 lis/c=57/57 les/c/f=58/58/0 sis=73 pruub=9.211073875s) [0,5,1] r=-1 lpr=73 pi=[57,73)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1249.801635742s@ mbc={}] start_peering_interval up [1,5,3] -> [0,5,1], acting [1,5,3] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 20 03:00:31 localhost ceph-osd[33177]: osd.3 pg_epoch: 73 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=57/58 n=1 ec=43/36 lis/c=57/57 les/c/f=58/58/0 sis=73 pruub=9.210969925s) [0,5,1] r=-1 lpr=73 pi=[57,73)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1249.801635742s@ mbc={}] state: transitioning to Stray Feb 20 03:00:31 localhost ceph-osd[32226]: osd.0 pg_epoch: 73 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=57/57 les/c/f=58/58/0 sis=73) [0,5,1] r=0 lpr=73 pi=[57,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 20 03:00:32 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.b scrub starts Feb 20 03:00:32 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.b scrub ok Feb 20 03:00:33 localhost ceph-osd[32226]: osd.0 pg_epoch: 74 pg[7.f( v 40'39 lc 38'1 (0'0,40'39] local-lis/les=73/74 n=3 ec=43/36 lis/c=57/57 les/c/f=58/58/0 sis=73) [0,5,1] r=0 lpr=73 pi=[57,73)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(2+1)=3}}] state: react AllReplicasActivated Activating complete Feb 20 03:00:33 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.1d scrub starts Feb 20 03:00:36 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.17 scrub starts Feb 20 03:00:36 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.17 scrub ok Feb 20 03:00:37 localhost sshd[59647]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:38 localhost ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.1d scrub starts Feb 20 03:00:38 localhost 
ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.1d scrub ok Feb 20 03:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:00:41 localhost podman[59725]: 2026-02-20 08:00:41.129898205 +0000 UTC m=+0.068987331 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, release=1766032510) Feb 20 03:00:41 localhost podman[59725]: 2026-02-20 08:00:41.353390703 +0000 UTC m=+0.292479879 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, version=17.1.13, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 20 03:00:41 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:00:46 localhost sshd[59754]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:52 localhost sshd[59756]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:54 localhost sshd[59758]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:00:58 localhost sshd[59760]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:05 localhost sshd[59773]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:07 localhost sshd[59775]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:09 localhost sshd[59777]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:01:12 localhost podman[59779]: 2026-02-20 08:01:12.147854851 +0000 UTC m=+0.087136372 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:01:12 localhost podman[59779]: 2026-02-20 08:01:12.358179262 +0000 UTC m=+0.297460723 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:01:12 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:01:14 localhost sshd[59808]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:22 localhost sshd[59810]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:29 localhost sshd[59812]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:35 localhost sshd[59814]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:40 localhost sshd[59929]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:40 localhost systemd[1]: tmp-crun.dq5Nqp.mount: Deactivated successfully. Feb 20 03:01:40 localhost podman[59919]: 2026-02-20 08:01:40.368821187 +0000 UTC m=+0.090244654 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, 
GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=1770267347, build-date=2026-02-09T10:25:24Z) Feb 20 03:01:40 localhost podman[59919]: 2026-02-20 08:01:40.472019785 +0000 UTC m=+0.193443312 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, release=1770267347, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 03:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:01:43 localhost sshd[60073]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:43 localhost podman[60062]: 2026-02-20 08:01:43.134791317 +0000 UTC m=+0.074741675 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, url=https://www.redhat.com) Feb 20 03:01:43 localhost podman[60062]: 2026-02-20 08:01:43.329180006 +0000 UTC m=+0.269130374 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, config_id=tripleo_step1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) Feb 20 03:01:43 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:01:49 localhost sshd[60093]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:01:55 localhost sshd[60095]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:03 localhost sshd[60097]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:09 localhost sshd[60099]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:02:14 localhost podman[60101]: 2026-02-20 08:02:14.133849401 +0000 UTC m=+0.077437905 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, vcs-type=git, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:02:14 localhost podman[60101]: 2026-02-20 08:02:14.315485242 +0000 UTC m=+0.259073806 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:02:14 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:02:23 localhost sshd[60131]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:27 localhost sshd[60133]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:29 localhost sshd[60135]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:42 localhost sshd[60184]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:02:45 localhost systemd[1]: tmp-crun.r3yGUB.mount: Deactivated successfully. Feb 20 03:02:45 localhost podman[60215]: 2026-02-20 08:02:45.151597698 +0000 UTC m=+0.089029909 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git) Feb 20 03:02:45 localhost podman[60215]: 2026-02-20 08:02:45.359168937 +0000 UTC m=+0.296601188 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:02:45 localhost systemd[1]: 
f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:02:49 localhost sshd[60244]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:53 localhost sshd[60246]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:02:57 localhost sshd[60248]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:02 localhost sshd[60250]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:07 localhost sshd[60252]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:10 localhost sshd[60254]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:12 localhost sshd[60256]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:03:16 localhost podman[60258]: 2026-02-20 08:03:16.146547229 +0000 UTC m=+0.084453905 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, architecture=x86_64) Feb 20 03:03:16 localhost podman[60258]: 2026-02-20 08:03:16.377362145 +0000 UTC m=+0.315268811 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 03:03:16 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:03:17 localhost sshd[60288]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:23 localhost sshd[60290]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:26 localhost sshd[60292]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:30 localhost sshd[60294]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:33 localhost sshd[60296]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:38 localhost sshd[60298]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:41 localhost sshd[60300]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:44 localhost sshd[60364]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:03:47 localhost podman[60381]: 2026-02-20 08:03:47.102051912 +0000 UTC m=+0.066450835 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:03:47 localhost podman[60381]: 2026-02-20 08:03:47.287103392 +0000 UTC m=+0.251502335 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:03:47 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:03:49 localhost sshd[60410]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:53 localhost sshd[60412]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:58 localhost sshd[60414]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:03:58 localhost sshd[60416]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:06 localhost sshd[60418]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:10 localhost sshd[60420]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:13 localhost sshd[60422]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:04:18 localhost systemd[1]: tmp-crun.DQMc8y.mount: Deactivated successfully. Feb 20 03:04:18 localhost podman[60424]: 2026-02-20 08:04:18.147476457 +0000 UTC m=+0.083791555 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:04:18 localhost podman[60424]: 2026-02-20 08:04:18.341932694 +0000 UTC m=+0.278247822 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:04:18 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:04:18 localhost sshd[60452]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:23 localhost sshd[60454]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:28 localhost sshd[60456]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:30 localhost python3[60505]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:04:31 localhost python3[60550]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574670.3824532-99243-6894457537755/source _original_basename=tmp1yp4vlhb follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:31 localhost sshd[60565]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:32 localhost python3[60581]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:04:33 localhost sshd[60650]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:34 localhost ansible-async_wrapper.py[60755]: Invoked with 303677600785 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574673.6000972-99471-144000516931635/AnsiballZ_command.py _ Feb 20 03:04:34 localhost ansible-async_wrapper.py[60758]: 
Starting module and watcher Feb 20 03:04:34 localhost ansible-async_wrapper.py[60758]: Start watching 60759 (3600) Feb 20 03:04:34 localhost ansible-async_wrapper.py[60759]: Start module (60759) Feb 20 03:04:34 localhost ansible-async_wrapper.py[60755]: Return async_wrapper task started. Feb 20 03:04:34 localhost python3[60779]: ansible-ansible.legacy.async_status Invoked with jid=303677600785.60755 mode=status _async_dir=/tmp/.ansible_async Feb 20 03:04:36 localhost sshd[60820]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:37 localhost puppet-user[60763]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 03:04:37 localhost puppet-user[60763]: (file: /etc/puppet/hiera.yaml) Feb 20 03:04:37 localhost puppet-user[60763]: Warning: Undefined variable '::deploy_config_name'; Feb 20 03:04:37 localhost puppet-user[60763]: (file & line not available) Feb 20 03:04:37 localhost puppet-user[60763]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 03:04:37 localhost puppet-user[60763]: (file & line not available) Feb 20 03:04:37 localhost puppet-user[60763]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Feb 20 03:04:37 localhost sshd[60892]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:37 localhost puppet-user[60763]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 20 03:04:37 localhost puppet-user[60763]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.12 seconds Feb 20 03:04:37 localhost puppet-user[60763]: Notice: Applied catalog in 0.03 seconds Feb 20 03:04:37 localhost puppet-user[60763]: Application: Feb 20 03:04:37 localhost puppet-user[60763]: Initial environment: production Feb 20 03:04:37 localhost puppet-user[60763]: Converged environment: production Feb 20 03:04:37 localhost puppet-user[60763]: Run mode: user Feb 20 03:04:37 localhost puppet-user[60763]: Changes: Feb 20 03:04:37 localhost puppet-user[60763]: Events: Feb 20 03:04:37 localhost puppet-user[60763]: Resources: Feb 20 03:04:37 localhost puppet-user[60763]: Total: 10 Feb 20 03:04:37 localhost puppet-user[60763]: Time: Feb 20 03:04:37 localhost puppet-user[60763]: Schedule: 0.00 Feb 20 03:04:37 localhost puppet-user[60763]: File: 0.00 Feb 20 03:04:37 localhost puppet-user[60763]: Exec: 0.00 Feb 20 03:04:37 localhost puppet-user[60763]: Augeas: 0.01 Feb 20 03:04:37 localhost puppet-user[60763]: Transaction evaluation: 0.03 Feb 20 03:04:37 localhost puppet-user[60763]: Catalog application: 0.03 Feb 20 03:04:37 localhost puppet-user[60763]: Config retrieval: 0.15 Feb 20 03:04:37 localhost puppet-user[60763]: Last run: 1771574677 Feb 20 03:04:37 localhost puppet-user[60763]: Filebucket: 0.00 Feb 20 03:04:37 localhost puppet-user[60763]: Total: 0.04 Feb 20 03:04:37 localhost puppet-user[60763]: Version: Feb 20 03:04:37 localhost puppet-user[60763]: Config: 1771574677 Feb 20 03:04:37 localhost puppet-user[60763]: Puppet: 7.10.0 Feb 20 03:04:37 localhost ansible-async_wrapper.py[60759]: Module complete (60759) Feb 20 03:04:39 localhost ansible-async_wrapper.py[60758]: Done in kid B. 
Feb 20 03:04:41 localhost sshd[60895]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:43 localhost sshd[60897]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:44 localhost python3[60914]: ansible-ansible.legacy.async_status Invoked with jid=303677600785.60755 mode=status _async_dir=/tmp/.ansible_async Feb 20 03:04:45 localhost sshd[60975]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:45 localhost python3[60967]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 03:04:45 localhost sshd[60982]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:46 localhost python3[61010]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:04:46 localhost python3[61060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:04:46 localhost python3[61093]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpu3k5w8dk recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 03:04:47 localhost python3[61124]: ansible-file 
Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:04:48 localhost podman[61227]: 2026-02-20 08:04:48.592411763 +0000 UTC m=+0.091101957 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:04:48 localhost python3[61228]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 20 03:04:48 localhost podman[61227]: 2026-02-20 08:04:48.84201349 +0000 UTC m=+0.340703664 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:04:48 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:04:49 localhost python3[61274]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:50 localhost python3[61307]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:04:51 localhost sshd[61358]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:51 localhost python3[61357]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:04:51 localhost python3[61376]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:52 localhost python3[61438]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:04:52 localhost python3[61456]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:52 localhost python3[61518]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:04:53 localhost python3[61537]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:53 localhost python3[61599]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:04:54 localhost python3[61617]: 
ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:54 localhost python3[61647]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:04:54 localhost systemd[1]: Reloading. Feb 20 03:04:54 localhost systemd-rc-local-generator[61671]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:04:54 localhost systemd-sysv-generator[61674]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:04:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 03:04:55 localhost sshd[61724]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:55 localhost python3[61733]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:04:55 localhost python3[61751]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:56 localhost python3[61814]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:04:56 localhost python3[61832]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:04:57 localhost python3[61862]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:04:57 localhost systemd[1]: Reloading. 
Feb 20 03:04:57 localhost systemd-rc-local-generator[61888]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:04:57 localhost systemd-sysv-generator[61892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:04:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:04:57 localhost sshd[61899]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:04:57 localhost systemd[1]: Starting Create netns directory... Feb 20 03:04:57 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 20 03:04:57 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 20 03:04:57 localhost systemd[1]: Finished Create netns directory. 
Feb 20 03:04:58 localhost python3[61920]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 20 03:05:00 localhost python3[61979]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 20 03:05:00 localhost podman[62141]: 2026-02-20 08:05:00.565794354 +0000 UTC m=+0.082875177 container create 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc.) 
Feb 20 03:05:00 localhost podman[62147]: 2026-02-20 08:05:00.596490689 +0000 UTC m=+0.100038510 container create d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, architecture=x86_64) Feb 20 03:05:00 localhost systemd[1]: Started 
libpod-conmon-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.scope. Feb 20 03:05:00 localhost podman[62150]: 2026-02-20 08:05:00.620399798 +0000 UTC m=+0.102289719 container create ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, container_name=rsyslog, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:05:00 localhost podman[62141]: 2026-02-20 08:05:00.521783703 +0000 UTC m=+0.038864626 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 20 03:05:00 localhost podman[62147]: 2026-02-20 08:05:00.538864033 +0000 UTC m=+0.042411874 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 20 03:05:00 localhost podman[62182]: 2026-02-20 08:05:00.645957097 +0000 UTC m=+0.107395884 container create e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:05:00 localhost systemd[1]: Started libcrun container. Feb 20 03:05:00 localhost systemd[1]: Started libpod-conmon-d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6.scope. Feb 20 03:05:00 localhost systemd[1]: Started libpod-conmon-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope. Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27ac25f75ac951fbeef2be74c2898e3e141e5c323a5908632b2bdca4094605f7/merged/scripts supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27ac25f75ac951fbeef2be74c2898e3e141e5c323a5908632b2bdca4094605f7/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost podman[62150]: 2026-02-20 08:05:00.56206894 +0000 UTC m=+0.043958881 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 20 03:05:00 localhost podman[62182]: 2026-02-20 08:05:00.569203357 +0000 UTC m=+0.030642154 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:00 localhost systemd[1]: Started libcrun container. Feb 20 03:05:00 localhost systemd[1]: Started libcrun container. 
Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost systemd[1]: Started libpod-conmon-e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19.scope. Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b52b75a7380249fd6beb40dca6e23a5c2c2b3650de6523e005db6f52b5fe90d0/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost systemd[1]: Started libcrun container. Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost podman[62150]: 2026-02-20 08:05:00.697205179 +0000 UTC m=+0.179095080 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, 
name=rhosp-rhel9/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:05:00 localhost podman[62182]: 2026-02-20 08:05:00.701330595 +0000 UTC m=+0.162769382 container init e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, tcib_managed=true, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, container_name=nova_virtlogd_wrapper, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com) Feb 20 03:05:00 localhost podman[62141]: 2026-02-20 08:05:00.701964074 +0000 UTC m=+0.219044907 container init 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:05:00 localhost podman[62150]: 2026-02-20 08:05:00.70609546 +0000 UTC m=+0.187985351 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64) Feb 20 03:05:00 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=eb8c5e608f55bc52c95871f92a543185 --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 20 03:05:00 localhost podman[62182]: 2026-02-20 08:05:00.710869545 +0000 UTC m=+0.172308332 container start e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, container_name=nova_virtlogd_wrapper, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git) Feb 20 03:05:00 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:00 localhost podman[62168]: 2026-02-20 08:05:00.720478358 +0000 UTC m=+0.198437788 container create a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 
'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, distribution-scope=public) Feb 20 03:05:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:05:00 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. 
Feb 20 03:05:00 localhost podman[62141]: 2026-02-20 08:05:00.742287863 +0000 UTC m=+0.259368686 container start 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, version=17.1.13, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=collectd, managed_by=tripleo_ansible) Feb 20 03:05:00 localhost podman[62147]: 2026-02-20 08:05:00.747275315 +0000 UTC m=+0.250823156 container init d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_init_log, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, 
distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=) Feb 20 03:05:00 localhost systemd[1]: Created slice User Slice of UID 0. Feb 20 03:05:00 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d31718fcd17fdeee6489534105191c7a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 20 03:05:00 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Feb 20 03:05:00 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. Feb 20 03:05:00 localhost systemd[1]: Started libpod-conmon-a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0.scope. Feb 20 03:05:00 localhost podman[62147]: 2026-02-20 08:05:00.763023155 +0000 UTC m=+0.266570966 container start d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_init_log, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64) Feb 20 03:05:00 localhost podman[62168]: 2026-02-20 08:05:00.666446971 +0000 UTC m=+0.144406411 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 03:05:00 localhost systemd[1]: libpod-d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6.scope: Deactivated successfully. Feb 20 03:05:00 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 20 03:05:00 localhost systemd[1]: Starting User Manager for UID 0... Feb 20 03:05:00 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Feb 20 03:05:00 localhost systemd[1]: Started libcrun container. 
Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40a74de9e16f39ddd50a68ccb753b2764268a068f562b46f9bbfdae63acb7788/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40a74de9e16f39ddd50a68ccb753b2764268a068f562b46f9bbfdae63acb7788/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40a74de9e16f39ddd50a68ccb753b2764268a068f562b46f9bbfdae63acb7788/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:00 localhost systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully. Feb 20 03:05:00 localhost podman[62168]: 2026-02-20 08:05:00.80387546 +0000 UTC m=+0.281834900 container init a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
config_id=tripleo_step3, release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_statedir_owner, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13) Feb 20 03:05:00 localhost podman[62168]: 2026-02-20 08:05:00.8219208 +0000 UTC m=+0.299880230 container start a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, container_name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:05:00 localhost podman[62168]: 2026-02-20 08:05:00.822149067 +0000 UTC m=+0.300108517 container attach a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 
'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_statedir_owner, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public) Feb 20 03:05:00 localhost systemd[1]: libpod-a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0.scope: Deactivated successfully. 
Feb 20 03:05:00 localhost podman[62168]: 2026-02-20 08:05:00.871201642 +0000 UTC m=+0.349161122 container died a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, distribution-scope=public, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., version=17.1.13, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true) Feb 20 03:05:00 localhost podman[62254]: 2026-02-20 08:05:00.867065786 +0000 UTC m=+0.118761111 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team) Feb 20 03:05:00 localhost podman[62307]: 2026-02-20 08:05:00.904734274 +0000 UTC m=+0.094471300 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5) Feb 20 03:05:00 localhost systemd[62274]: Queued start job for default target Main User Target. Feb 20 03:05:00 localhost systemd[62274]: Created slice User Application Slice. 
Feb 20 03:05:00 localhost systemd[62274]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 20 03:05:00 localhost systemd[62274]: Started Daily Cleanup of User's Temporary Directories. Feb 20 03:05:00 localhost systemd[62274]: Reached target Paths. Feb 20 03:05:00 localhost systemd[62274]: Reached target Timers. Feb 20 03:05:00 localhost systemd[62274]: Starting D-Bus User Message Bus Socket... Feb 20 03:05:00 localhost systemd[62274]: Starting Create User's Volatile Files and Directories... Feb 20 03:05:00 localhost systemd[62274]: Listening on D-Bus User Message Bus Socket. Feb 20 03:05:00 localhost systemd[62274]: Finished Create User's Volatile Files and Directories. Feb 20 03:05:00 localhost systemd[62274]: Reached target Sockets. Feb 20 03:05:00 localhost systemd[62274]: Reached target Basic System. Feb 20 03:05:00 localhost systemd[62274]: Reached target Main User Target. Feb 20 03:05:00 localhost systemd[62274]: Startup finished in 114ms. Feb 20 03:05:00 localhost systemd[1]: Started User Manager for UID 0. 
Feb 20 03:05:00 localhost podman[62307]: 2026-02-20 08:05:00.932784519 +0000 UTC m=+0.122521525 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, architecture=x86_64, vcs-type=git, config_id=tripleo_step3, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1) Feb 20 03:05:00 localhost systemd[1]: Started Session c1 of User root. Feb 20 03:05:00 localhost systemd[1]: Started Session c2 of User root. Feb 20 03:05:00 localhost systemd[1]: libpod-conmon-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully. Feb 20 03:05:00 localhost podman[62281]: 2026-02-20 08:05:00.953408408 +0000 UTC m=+0.153190920 container died d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, url=https://www.redhat.com, container_name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, 
batch=17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:05:00 localhost podman[62254]: 2026-02-20 08:05:00.953950704 +0000 UTC m=+0.205646019 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, release=1766032510, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 20 03:05:01 localhost podman[62271]: 2026-02-20 08:05:01.003466923 +0000 UTC m=+0.215676545 container cleanup d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step3, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_init_log, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13) Feb 20 03:05:01 localhost systemd[1]: libpod-conmon-d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6.scope: Deactivated successfully. Feb 20 03:05:01 localhost systemd[1]: session-c1.scope: Deactivated successfully. 
Feb 20 03:05:01 localhost podman[62254]: unhealthy Feb 20 03:05:01 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:05:01 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Failed with result 'exit-code'. Feb 20 03:05:01 localhost systemd[1]: session-c2.scope: Deactivated successfully. Feb 20 03:05:01 localhost podman[62356]: 2026-02-20 08:05:01.044320258 +0000 UTC m=+0.162503604 container cleanup a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, 
name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.) Feb 20 03:05:01 localhost systemd[1]: libpod-conmon-a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0.scope: Deactivated successfully. Feb 20 03:05:01 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume 
/var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py Feb 20 03:05:01 localhost podman[62523]: 2026-02-20 08:05:01.425159175 +0000 UTC m=+0.090372735 container create 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:05:01 localhost systemd[1]: Started libpod-conmon-5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def.scope. 
Feb 20 03:05:01 localhost systemd[1]: Started libcrun container. Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost podman[62523]: 2026-02-20 08:05:01.388792077 +0000 UTC m=+0.054005657 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:01 localhost podman[62523]: 2026-02-20 08:05:01.494331504 +0000 UTC m=+0.159545074 container init 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-libvirt-container) Feb 20 03:05:01 localhost podman[62523]: 2026-02-20 08:05:01.504031039 +0000 UTC m=+0.169244569 container start 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Feb 20 03:05:01 localhost podman[62555]: 2026-02-20 08:05:01.509145345 +0000 UTC m=+0.077572515 container create c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt) Feb 20 03:05:01 localhost systemd[1]: Started libpod-conmon-c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2.scope. 
Feb 20 03:05:01 localhost podman[62555]: 2026-02-20 08:05:01.470330202 +0000 UTC m=+0.038757432 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:01 localhost systemd[1]: var-lib-containers-storage-overlay-b52b75a7380249fd6beb40dca6e23a5c2c2b3650de6523e005db6f52b5fe90d0-merged.mount: Deactivated successfully. Feb 20 03:05:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6-userdata-shm.mount: Deactivated successfully. Feb 20 03:05:01 localhost systemd[1]: Started libcrun container. Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/lib/nova supports timestamps 
until 2038 (0x7fffffff) Feb 20 03:05:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:01 localhost podman[62555]: 2026-02-20 08:05:01.630442462 +0000 UTC m=+0.198869622 container init c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 20 03:05:01 localhost podman[62555]: 2026-02-20 08:05:01.636157996 +0000 UTC m=+0.204585156 container start c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, architecture=x86_64, url=https://www.redhat.com, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtsecretd, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true) Feb 20 03:05:01 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:01 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. Feb 20 03:05:01 localhost systemd[1]: Started Session c3 of User root. Feb 20 03:05:01 localhost systemd[1]: session-c3.scope: Deactivated successfully. 
Feb 20 03:05:02 localhost podman[62703]: 2026-02-20 08:05:02.029302409 +0000 UTC m=+0.079302779 container create b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtnodedevd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5) Feb 20 03:05:02 localhost podman[62711]: 2026-02-20 08:05:02.062720607 +0000 UTC m=+0.100382461 container create 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, 
tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO 
Team, url=https://www.redhat.com) Feb 20 03:05:02 localhost podman[62703]: 2026-02-20 08:05:01.985775901 +0000 UTC m=+0.035776341 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:02 localhost systemd[1]: Started libpod-conmon-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1.scope. Feb 20 03:05:02 localhost systemd[1]: Started libpod-conmon-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.scope. Feb 20 03:05:02 localhost podman[62711]: 2026-02-20 08:05:01.995328163 +0000 UTC m=+0.032990037 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 20 03:05:02 localhost systemd[1]: Started libcrun container. Feb 20 03:05:02 localhost systemd[1]: Started libcrun container. Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9b5811d370cf611c5d7f7587dd7d8e1e05fe7557daab610e6d30271092c47d/merged/etc/target supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9b5811d370cf611c5d7f7587dd7d8e1e05fe7557daab610e6d30271092c47d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:05:02 localhost podman[62711]: 2026-02-20 08:05:02.14453508 +0000 UTC m=+0.182196944 container init 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 
17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, release=1766032510, vcs-type=git, container_name=iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 20 03:05:02 localhost podman[62703]: 2026-02-20 08:05:02.173787091 +0000 UTC m=+0.223787481 container init b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, container_name=nova_virtnodedevd, release=1766032510, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, 
Inc.) Feb 20 03:05:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:05:02 localhost podman[62703]: 2026-02-20 08:05:02.184093656 +0000 UTC m=+0.234094086 container start b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, container_name=nova_virtnodedevd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc.) 
Feb 20 03:05:02 localhost podman[62711]: 2026-02-20 08:05:02.186496109 +0000 UTC m=+0.224157983 container start 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, container_name=iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:05:02 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume 
/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:02 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. Feb 20 03:05:02 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=24eefedeb2e4ab8bab62979b617bbba7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 20 03:05:02 localhost systemd[1]: Started Session c4 of User root. Feb 20 03:05:02 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. Feb 20 03:05:02 localhost systemd[1]: Started Session c5 of User root. 
Feb 20 03:05:02 localhost podman[62742]: 2026-02-20 08:05:02.295209743 +0000 UTC m=+0.096798882 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, 
architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 20 03:05:02 localhost systemd[1]: session-c4.scope: Deactivated successfully. Feb 20 03:05:02 localhost podman[62742]: 2026-02-20 08:05:02.30990075 +0000 UTC m=+0.111489859 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-iscsid-container, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true) Feb 20 03:05:02 localhost podman[62742]: unhealthy Feb 20 03:05:02 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:05:02 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Failed with result 'exit-code'. Feb 20 03:05:02 localhost kernel: Loading iSCSI transport class v2.0-870. Feb 20 03:05:02 localhost systemd[1]: session-c5.scope: Deactivated successfully. 
Feb 20 03:05:02 localhost sshd[62886]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:02 localhost podman[62880]: 2026-02-20 08:05:02.737895055 +0000 UTC m=+0.095303526 container create 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible) Feb 20 03:05:02 localhost systemd[1]: Started libpod-conmon-025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108.scope. Feb 20 03:05:02 localhost podman[62880]: 2026-02-20 08:05:02.694391318 +0000 UTC m=+0.051799819 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:02 localhost systemd[1]: Started libcrun container. 
Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:02 localhost podman[62880]: 2026-02-20 08:05:02.817428288 +0000 UTC m=+0.174836759 container init 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, org.opencontainers.image.created=2026-01-12T23:31:49Z, 
batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, container_name=nova_virtstoraged, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com) Feb 20 03:05:02 localhost podman[62880]: 2026-02-20 08:05:02.827895508 +0000 UTC m=+0.185303989 container start 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., container_name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:05:02 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', 
'/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume 
/var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:02 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. Feb 20 03:05:02 localhost systemd[1]: Started Session c6 of User root. Feb 20 03:05:02 localhost systemd[1]: session-c6.scope: Deactivated successfully. Feb 20 03:05:03 localhost podman[62987]: 2026-02-20 08:05:03.288851026 +0000 UTC m=+0.089149718 container create 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, architecture=x86_64, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, container_name=nova_virtqemud, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, release=1766032510, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:05:03 localhost systemd[1]: Started libpod-conmon-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69.scope. Feb 20 03:05:03 localhost podman[62987]: 2026-02-20 08:05:03.246588038 +0000 UTC m=+0.046886760 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:03 localhost systemd[1]: Started libcrun container. Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 
03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost podman[62987]: 2026-02-20 08:05:03.362724538 +0000 UTC m=+0.163023240 container init 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_virtqemud, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 
'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=) Feb 20 03:05:03 localhost podman[62987]: 2026-02-20 08:05:03.372516636 +0000 UTC m=+0.172815328 container 
start 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_virtqemud, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:05:03 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro 
--volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:03 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. Feb 20 03:05:03 localhost systemd[1]: Started Session c7 of User root. Feb 20 03:05:03 localhost systemd[1]: session-c7.scope: Deactivated successfully. 
Feb 20 03:05:03 localhost podman[63090]: 2026-02-20 08:05:03.803925834 +0000 UTC m=+0.073677286 container create e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtproxyd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 20 03:05:03 localhost systemd[1]: Started libpod-conmon-e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34.scope. Feb 20 03:05:03 localhost systemd[1]: Started libcrun container. 
Feb 20 03:05:03 localhost podman[63090]: 2026-02-20 08:05:03.763831173 +0000 UTC m=+0.033582615 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:03 localhost podman[63090]: 2026-02-20 08:05:03.875374043 +0000 UTC m=+0.145125485 container init 
e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_id=tripleo_step3, container_name=nova_virtproxyd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:05:03 localhost podman[63090]: 2026-02-20 08:05:03.884195581 +0000 UTC m=+0.153947013 container start e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, tcib_managed=true, config_id=tripleo_step3, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:31:49Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 20 03:05:03 localhost python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume 
/sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:05:03 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. Feb 20 03:05:03 localhost systemd[1]: Started Session c8 of User root. Feb 20 03:05:04 localhost systemd[1]: session-c8.scope: Deactivated successfully. 
Feb 20 03:05:04 localhost python3[63172]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:04 localhost python3[63188]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:05 localhost python3[63204]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:05 localhost python3[63220]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:05 localhost python3[63236]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:05 localhost python3[63252]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:06 localhost python3[63268]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:06 localhost python3[63284]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:06 localhost python3[63300]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:06 localhost python3[63316]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:07 localhost python3[63332]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:07 localhost python3[63348]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:07 localhost python3[63364]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:07 localhost python3[63380]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:08 localhost python3[63396]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:08 localhost python3[63412]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:08 localhost sshd[63429]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:05:08 localhost python3[63428]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:08 localhost python3[63445]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:05:09 localhost python3[63508]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:09 localhost python3[63537]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:10 localhost sshd[63567]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:05:10 localhost python3[63566]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:10 localhost python3[63597]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:11 localhost python3[63626]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:11 localhost python3[63655]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:12 localhost python3[63684]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:12 localhost python3[63713]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:13 localhost python3[63742]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:05:13 localhost python3[63758]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 03:05:13 localhost systemd[1]: Reloading.
Feb 20 03:05:13 localhost systemd-rc-local-generator[63784]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:13 localhost systemd-sysv-generator[63787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:14 localhost systemd[1]: Stopping User Manager for UID 0...
Feb 20 03:05:14 localhost systemd[62274]: Activating special unit Exit the Session...
Feb 20 03:05:14 localhost systemd[62274]: Stopped target Main User Target.
Feb 20 03:05:14 localhost systemd[62274]: Stopped target Basic System.
Feb 20 03:05:14 localhost systemd[62274]: Stopped target Paths.
Feb 20 03:05:14 localhost systemd[62274]: Stopped target Sockets.
Feb 20 03:05:14 localhost systemd[62274]: Stopped target Timers.
Feb 20 03:05:14 localhost systemd[62274]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 03:05:14 localhost systemd[62274]: Closed D-Bus User Message Bus Socket.
Feb 20 03:05:14 localhost systemd[62274]: Stopped Create User's Volatile Files and Directories.
Feb 20 03:05:14 localhost systemd[62274]: Removed slice User Application Slice.
Feb 20 03:05:14 localhost systemd[62274]: Reached target Shutdown.
Feb 20 03:05:14 localhost systemd[62274]: Finished Exit the Session.
Feb 20 03:05:14 localhost systemd[62274]: Reached target Exit the Session.
Feb 20 03:05:14 localhost systemd[1]: user@0.service: Deactivated successfully.
Feb 20 03:05:14 localhost systemd[1]: Stopped User Manager for UID 0.
Feb 20 03:05:14 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 20 03:05:14 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 20 03:05:14 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 20 03:05:14 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 20 03:05:14 localhost systemd[1]: Removed slice User Slice of UID 0.
Feb 20 03:05:14 localhost python3[63810]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:14 localhost systemd[1]: Reloading.
Feb 20 03:05:14 localhost systemd-sysv-generator[63843]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:14 localhost systemd-rc-local-generator[63840]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:15 localhost systemd[1]: Starting collectd container...
Feb 20 03:05:15 localhost systemd[1]: Started collectd container.
Feb 20 03:05:15 localhost python3[63877]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:15 localhost systemd[1]: Reloading.
Feb 20 03:05:15 localhost systemd-rc-local-generator[63906]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:15 localhost systemd-sysv-generator[63909]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:15 localhost sshd[63915]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:05:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:16 localhost systemd[1]: Starting iscsid container...
Feb 20 03:05:16 localhost systemd[1]: Started iscsid container.
Feb 20 03:05:16 localhost python3[63943]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:17 localhost systemd[1]: Reloading.
Feb 20 03:05:17 localhost systemd-rc-local-generator[63973]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:17 localhost systemd-sysv-generator[63976]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:18 localhost systemd[1]: Starting nova_virtlogd_wrapper container...
Feb 20 03:05:18 localhost systemd[1]: Started nova_virtlogd_wrapper container.
Feb 20 03:05:18 localhost python3[64012]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:18 localhost systemd[1]: Reloading.
Feb 20 03:05:18 localhost systemd-sysv-generator[64040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:18 localhost systemd-rc-local-generator[64035]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 03:05:19 localhost systemd[1]: Starting nova_virtnodedevd container...
Feb 20 03:05:19 localhost podman[64051]: 2026-02-20 08:05:19.237012316 +0000 UTC m=+0.084071724 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 20 03:05:19 localhost tripleo-start-podman-container[64052]: Creating additional drop-in dependency for "nova_virtnodedevd" (b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1)
Feb 20 03:05:19 localhost systemd[1]: Reloading.
Feb 20 03:05:19 localhost podman[64051]: 2026-02-20 08:05:19.459108121 +0000 UTC m=+0.306167489 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1)
Feb 20 03:05:19 localhost systemd-rc-local-generator[64133]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:19 localhost systemd-sysv-generator[64139]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:19 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 03:05:19 localhost systemd[1]: Started nova_virtnodedevd container.
Feb 20 03:05:20 localhost python3[64165]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:21 localhost sshd[64168]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:05:21 localhost systemd[1]: Reloading.
Feb 20 03:05:21 localhost systemd-sysv-generator[64200]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:21 localhost systemd-rc-local-generator[64193]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:21 localhost systemd[1]: Starting nova_virtproxyd container...
Feb 20 03:05:21 localhost tripleo-start-podman-container[64206]: Creating additional drop-in dependency for "nova_virtproxyd" (e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34)
Feb 20 03:05:21 localhost systemd[1]: Reloading.
Feb 20 03:05:21 localhost systemd-sysv-generator[64271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:21 localhost systemd-rc-local-generator[64267]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:22 localhost systemd[1]: Started nova_virtproxyd container.
Feb 20 03:05:22 localhost python3[64293]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:22 localhost systemd[1]: Reloading.
Feb 20 03:05:22 localhost systemd-sysv-generator[64325]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:22 localhost systemd-rc-local-generator[64322]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:23 localhost systemd[1]: Starting nova_virtqemud container...
Feb 20 03:05:23 localhost tripleo-start-podman-container[64333]: Creating additional drop-in dependency for "nova_virtqemud" (0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69)
Feb 20 03:05:23 localhost systemd[1]: Reloading.
Feb 20 03:05:23 localhost systemd-rc-local-generator[64390]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:23 localhost systemd-sysv-generator[64394]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:23 localhost systemd[1]: Started nova_virtqemud container.
Feb 20 03:05:24 localhost python3[64417]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:24 localhost systemd[1]: Reloading.
Feb 20 03:05:24 localhost systemd-rc-local-generator[64443]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:24 localhost systemd-sysv-generator[64451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:24 localhost systemd[1]: Starting nova_virtsecretd container...
Feb 20 03:05:24 localhost tripleo-start-podman-container[64458]: Creating additional drop-in dependency for "nova_virtsecretd" (c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2)
Feb 20 03:05:24 localhost systemd[1]: Reloading.
Feb 20 03:05:24 localhost systemd-sysv-generator[64518]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:24 localhost systemd-rc-local-generator[64513]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:25 localhost systemd[1]: Started nova_virtsecretd container.
Feb 20 03:05:25 localhost python3[64542]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:25 localhost systemd[1]: Reloading.
Feb 20 03:05:25 localhost systemd-sysv-generator[64572]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:25 localhost systemd-rc-local-generator[64569]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:25 localhost systemd[1]: Starting nova_virtstoraged container...
Feb 20 03:05:26 localhost tripleo-start-podman-container[64582]: Creating additional drop-in dependency for "nova_virtstoraged" (025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108)
Feb 20 03:05:26 localhost systemd[1]: Reloading.
Feb 20 03:05:26 localhost systemd-rc-local-generator[64639]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:26 localhost systemd-sysv-generator[64642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:26 localhost systemd[1]: Started nova_virtstoraged container.
Feb 20 03:05:27 localhost python3[64667]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:05:27 localhost systemd[1]: Reloading.
Feb 20 03:05:27 localhost systemd-rc-local-generator[64695]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:05:27 localhost systemd-sysv-generator[64698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:05:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:05:27 localhost systemd[1]: Starting rsyslog container...
Feb 20 03:05:27 localhost systemd[1]: Started libcrun container.
Feb 20 03:05:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 03:05:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 03:05:27 localhost podman[64707]: 2026-02-20 08:05:27.561216621 +0000 UTC m=+0.133423875 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 03:05:27 localhost podman[64707]: 2026-02-20 08:05:27.571392793 +0000 UTC m=+0.143600037 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3)
Feb 20 03:05:27 localhost podman[64707]: rsyslog
Feb 20 03:05:27 localhost systemd[1]: Started rsyslog container.
Feb 20 03:05:27 localhost systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully. Feb 20 03:05:27 localhost podman[64744]: 2026-02-20 08:05:27.743228821 +0000 UTC m=+0.054299099 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1766032510, architecture=x86_64, distribution-scope=public, 
org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=) Feb 20 03:05:27 localhost podman[64744]: 2026-02-20 08:05:27.770954616 +0000 UTC m=+0.082024894 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:05:27 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:05:27 localhost podman[64759]: 2026-02-20 08:05:27.862045258 +0000 UTC m=+0.066954004 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, version=17.1.13, release=1766032510, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, 
architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., 
org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 20 03:05:27 localhost podman[64759]: rsyslog Feb 20 03:05:27 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 20 03:05:28 localhost python3[64786]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:05:28 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1. Feb 20 03:05:28 localhost systemd[1]: Stopped rsyslog container. Feb 20 03:05:28 localhost systemd[1]: Starting rsyslog container... Feb 20 03:05:28 localhost systemd[1]: Started libcrun container. 
Feb 20 03:05:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:28 localhost podman[64788]: 2026-02-20 08:05:28.171879355 +0000 UTC m=+0.102966467 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=rsyslog, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20260112.1, config_id=tripleo_step3) Feb 20 03:05:28 localhost podman[64788]: 2026-02-20 08:05:28.180377528 +0000 UTC m=+0.111464650 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-rsyslog, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, container_name=rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:05:28 localhost podman[64788]: rsyslog Feb 20 03:05:28 localhost systemd[1]: Started rsyslog container. 
Feb 20 03:05:28 localhost systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully. Feb 20 03:05:28 localhost podman[64811]: 2026-02-20 08:05:28.358304616 +0000 UTC m=+0.059047090 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, container_name=rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public) Feb 20 03:05:28 localhost podman[64811]: 2026-02-20 08:05:28.38192486 +0000 UTC m=+0.082667294 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, architecture=x86_64) Feb 20 03:05:28 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:05:28 localhost podman[64846]: 2026-02-20 08:05:28.466940391 +0000 UTC m=+0.056715790 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public) Feb 20 03:05:28 localhost podman[64846]: rsyslog Feb 20 03:05:28 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 20 03:05:28 localhost systemd[1]: var-lib-containers-storage-overlay-49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c-merged.mount: Deactivated successfully. Feb 20 03:05:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5-userdata-shm.mount: Deactivated successfully. Feb 20 03:05:28 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2. Feb 20 03:05:28 localhost systemd[1]: Stopped rsyslog container. Feb 20 03:05:28 localhost systemd[1]: Starting rsyslog container... Feb 20 03:05:28 localhost systemd[1]: tmp-crun.KrCevN.mount: Deactivated successfully. Feb 20 03:05:28 localhost systemd[1]: Started libcrun container. 
Feb 20 03:05:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:28 localhost podman[64911]: 2026-02-20 08:05:28.956304363 +0000 UTC m=+0.130160107 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, container_name=rsyslog, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:09Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog) Feb 20 03:05:28 localhost podman[64911]: 2026-02-20 08:05:28.968481805 +0000 UTC m=+0.142337559 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Feb 20 03:05:28 localhost podman[64911]: rsyslog Feb 20 03:05:28 localhost systemd[1]: Started rsyslog container. 
Feb 20 03:05:29 localhost systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully. Feb 20 03:05:29 localhost podman[64950]: 2026-02-20 08:05:29.140170067 +0000 UTC m=+0.055153752 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1766032510, build-date=2026-01-12T22:10:09Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack 
TripleO Team, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc.) Feb 20 03:05:29 localhost podman[64950]: 2026-02-20 08:05:29.162948526 +0000 UTC m=+0.077932151 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2026-01-12T22:10:09Z, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com) Feb 20 03:05:29 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:05:29 localhost podman[64978]: 2026-02-20 08:05:29.253781081 +0000 UTC m=+0.055924346 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, 
version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, build-date=2026-01-12T22:10:09Z, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com) Feb 20 03:05:29 localhost podman[64978]: rsyslog Feb 20 03:05:29 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 20 03:05:29 localhost python3[65004]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005625204 step=3 update_config_hash_only=False Feb 20 03:05:29 localhost systemd[1]: var-lib-containers-storage-overlay-49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c-merged.mount: Deactivated successfully. Feb 20 03:05:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5-userdata-shm.mount: Deactivated successfully. Feb 20 03:05:29 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3. Feb 20 03:05:29 localhost systemd[1]: Stopped rsyslog container. Feb 20 03:05:29 localhost systemd[1]: Starting rsyslog container... Feb 20 03:05:29 localhost systemd[1]: Started libcrun container. 
Feb 20 03:05:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:29 localhost podman[65005]: 2026-02-20 08:05:29.659991847 +0000 UTC m=+0.108411550 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:09Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5) Feb 20 03:05:29 localhost podman[65005]: 2026-02-20 08:05:29.670183131 +0000 UTC m=+0.118602804 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, container_name=rsyslog, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:05:29 localhost podman[65005]: rsyslog Feb 20 03:05:29 localhost systemd[1]: Started rsyslog container. 
Feb 20 03:05:29 localhost sshd[65025]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:29 localhost systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully. Feb 20 03:05:29 localhost podman[65031]: 2026-02-20 08:05:29.830305709 +0000 UTC m=+0.051615659 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, vcs-type=git, container_name=rsyslog, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, vendor=Red Hat, Inc.) Feb 20 03:05:29 localhost podman[65031]: 2026-02-20 08:05:29.85789729 +0000 UTC m=+0.079207210 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:05:29 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:05:29 localhost podman[65043]: 2026-02-20 08:05:29.94286992 +0000 UTC m=+0.051352280 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, 
org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, vendor=Red Hat, Inc.) Feb 20 03:05:29 localhost podman[65043]: rsyslog Feb 20 03:05:29 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 20 03:05:30 localhost sshd[65070]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:30 localhost python3[65069]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:05:30 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4. Feb 20 03:05:30 localhost systemd[1]: Stopped rsyslog container. Feb 20 03:05:30 localhost systemd[1]: Starting rsyslog container... Feb 20 03:05:30 localhost systemd[1]: Started libcrun container. 
Feb 20 03:05:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 20 03:05:30 localhost podman[65088]: 2026-02-20 08:05:30.386721847 +0000 UTC m=+0.110981405 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-rsyslog-container, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, tcib_managed=true, container_name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, architecture=x86_64) Feb 20 03:05:30 localhost podman[65088]: 2026-02-20 08:05:30.394870239 +0000 UTC m=+0.119129797 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:09Z, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:05:30 localhost podman[65088]: rsyslog Feb 20 03:05:30 localhost systemd[1]: Started rsyslog container. 
Feb 20 03:05:30 localhost python3[65087]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 20 03:05:30 localhost systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully. Feb 20 03:05:30 localhost systemd[1]: tmp-crun.r9jCUM.mount: Deactivated successfully. Feb 20 03:05:30 localhost podman[65109]: 2026-02-20 08:05:30.559894943 +0000 UTC m=+0.050101113 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-rsyslog, container_name=rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, distribution-scope=public) Feb 20 03:05:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5-userdata-shm.mount: Deactivated successfully. Feb 20 03:05:30 localhost systemd[1]: var-lib-containers-storage-overlay-49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c-merged.mount: Deactivated successfully. 
Feb 20 03:05:30 localhost podman[65109]: 2026-02-20 08:05:30.583194187 +0000 UTC m=+0.073400337 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step3) Feb 20 03:05:30 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:05:30 localhost podman[65121]: 2026-02-20 08:05:30.671303381 +0000 UTC m=+0.058174663 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container, vcs-type=git) Feb 20 03:05:30 localhost podman[65121]: rsyslog Feb 20 03:05:30 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 20 03:05:30 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5. Feb 20 03:05:30 localhost systemd[1]: Stopped rsyslog container. Feb 20 03:05:30 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly. Feb 20 03:05:30 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. 
Feb 20 03:05:30 localhost systemd[1]: Failed to start rsyslog container. Feb 20 03:05:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:05:31 localhost podman[65132]: 2026-02-20 08:05:31.145037408 +0000 UTC m=+0.086043194 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=collectd, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z) Feb 20 03:05:31 localhost podman[65132]: 2026-02-20 08:05:31.156082017 +0000 UTC m=+0.097087823 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:05:31 
localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:05:32 localhost sshd[65153]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:05:33 localhost podman[65155]: 2026-02-20 08:05:33.135801859 +0000 UTC m=+0.075863751 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:05:33 localhost podman[65155]: 2026-02-20 08:05:33.146375953 +0000 UTC m=+0.086437905 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1766032510, com.redhat.component=openstack-iscsid-container) Feb 20 03:05:33 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:05:37 localhost sshd[65174]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:42 localhost sshd[65176]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:46 localhost sshd[65208]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:48 localhost sshd[65256]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:05:50 localhost podman[65258]: 2026-02-20 08:05:50.017595803 +0000 UTC m=+0.090105784 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, config_id=tripleo_step1, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:05:50 localhost podman[65258]: 2026-02-20 08:05:50.258159577 +0000 UTC m=+0.330669558 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:05:50 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:05:51 localhost sshd[65287]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:56 localhost sshd[65289]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:05:59 localhost sshd[65291]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:06:01 localhost systemd[1]: tmp-crun.VwWk1N.mount: Deactivated successfully. Feb 20 03:06:01 localhost podman[65293]: 2026-02-20 08:06:01.687795422 +0000 UTC m=+0.097032381 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13) Feb 20 03:06:01 localhost podman[65293]: 2026-02-20 08:06:01.723566276 +0000 UTC m=+0.132803175 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, 
konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64) Feb 20 03:06:01 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:06:02 localhost sshd[65315]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:06:04 localhost podman[65317]: 2026-02-20 08:06:04.136416196 +0000 UTC m=+0.075570042 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true) Feb 20 03:06:04 localhost podman[65317]: 2026-02-20 08:06:04.148079713 +0000 UTC m=+0.087233629 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1766032510, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 20 03:06:04 localhost systemd[1]: 
5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:06:07 localhost sshd[65335]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:14 localhost sshd[65337]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:16 localhost sshd[65339]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:19 localhost sshd[65341]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:06:21 localhost podman[65343]: 2026-02-20 08:06:21.145275395 +0000 UTC m=+0.084280851 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 20 03:06:21 localhost podman[65343]: 2026-02-20 08:06:21.379596932 +0000 UTC m=+0.318602348 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 20 03:06:21 localhost systemd[1]: 
f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:06:22 localhost sshd[65371]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:24 localhost sshd[65373]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:29 localhost sshd[65375]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:06:32 localhost podman[65377]: 2026-02-20 08:06:32.124169837 +0000 UTC m=+0.068176451 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=collectd, version=17.1.13) Feb 20 03:06:32 localhost podman[65377]: 2026-02-20 08:06:32.131684881 +0000 UTC m=+0.075691475 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, 
architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, io.buildah.version=1.41.5, 
release=1766032510, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13) Feb 20 03:06:32 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:06:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:06:35 localhost podman[65397]: 2026-02-20 08:06:35.116207822 +0000 UTC m=+0.062495632 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1766032510, url=https://www.redhat.com, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:06:35 localhost podman[65397]: 2026-02-20 08:06:35.149016879 +0000 UTC m=+0.095304669 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container) Feb 20 03:06:35 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:06:38 localhost sshd[65417]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:42 localhost sshd[65419]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:46 localhost sshd[65421]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:06:51 localhost podman[65499]: 2026-02-20 08:06:51.662003594 +0000 UTC m=+0.088648031 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:06:51 localhost podman[65499]: 2026-02-20 08:06:51.864243617 +0000 UTC m=+0.290888024 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=) Feb 20 03:06:51 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:06:51 localhost sshd[65529]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:55 localhost sshd[65531]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:06:58 localhost sshd[65533]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:07:02 localhost podman[65535]: 2026-02-20 08:07:02.373473222 +0000 UTC m=+0.086049412 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 20 03:07:02 localhost podman[65535]: 2026-02-20 08:07:02.38984652 +0000 UTC m=+0.102422680 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 20 03:07:02 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:07:02 localhost sshd[65555]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:03 localhost sshd[65557]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:07:05 localhost systemd[1]: tmp-crun.TVxCk8.mount: Deactivated successfully. Feb 20 03:07:05 localhost podman[65559]: 2026-02-20 08:07:05.458391064 +0000 UTC m=+0.098936288 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:07:05 localhost podman[65559]: 2026-02-20 08:07:05.468252507 +0000 UTC m=+0.108797721 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, url=https://www.redhat.com) Feb 20 03:07:05 
localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:07:05 localhost sshd[65578]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:05 localhost sshd[65579]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:13 localhost sshd[65581]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:17 localhost sshd[65583]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:20 localhost sshd[65585]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:07:22 localhost podman[65587]: 2026-02-20 08:07:22.1279396 +0000 UTC m=+0.072461399 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 20 03:07:22 localhost podman[65587]: 2026-02-20 08:07:22.308069314 +0000 UTC m=+0.252591163 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_id=tripleo_step1, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:07:22 localhost 
systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:07:24 localhost sshd[65616]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:28 localhost sshd[65618]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:28 localhost sshd[65620]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:07:32 localhost podman[65622]: 2026-02-20 08:07:32.634840823 +0000 UTC m=+0.077898164 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 20 03:07:32 localhost podman[65622]: 2026-02-20 08:07:32.648005788 +0000 UTC m=+0.091063139 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:07:32 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:07:33 localhost sshd[65643]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:07:35 localhost podman[65645]: 2026-02-20 08:07:35.935192735 +0000 UTC m=+0.079491773 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc.) 
Feb 20 03:07:35 localhost podman[65645]: 2026-02-20 08:07:35.942895022 +0000 UTC m=+0.087194040 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, architecture=x86_64) Feb 20 03:07:35 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:07:37 localhost sshd[65664]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:43 localhost sshd[65666]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:47 localhost sshd[65668]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:50 localhost sshd[65717]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:52 localhost sshd[65733]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:07:53 localhost systemd[1]: tmp-crun.r2aXiz.mount: Deactivated successfully. 
Feb 20 03:07:53 localhost podman[65734]: 2026-02-20 08:07:53.166482817 +0000 UTC m=+0.102283336 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:07:53 localhost podman[65734]: 2026-02-20 08:07:53.36830248 +0000 UTC m=+0.304103009 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 03:07:53 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:07:55 localhost sshd[65779]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:07:58 localhost sshd[65781]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:01 localhost sshd[65783]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:08:03 localhost systemd[1]: tmp-crun.eyybr9.mount: Deactivated successfully. 
Feb 20 03:08:03 localhost podman[65785]: 2026-02-20 08:08:03.159521363 +0000 UTC m=+0.096933540 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13) Feb 20 03:08:03 localhost podman[65785]: 2026-02-20 08:08:03.171068089 +0000 UTC m=+0.108480266 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd) Feb 20 03:08:03 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:08:04 localhost sshd[65806]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:06 localhost sshd[65808]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:08:06 localhost podman[65809]: 2026-02-20 08:08:06.229792261 +0000 UTC m=+0.042769349 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:08:06 localhost podman[65809]: 2026-02-20 08:08:06.235038983 +0000 UTC m=+0.048016081 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, vcs-type=git, container_name=iscsid, konflux.additional-tags=17.1.13 
17.1_20260112.1, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510) Feb 20 03:08:06 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:08:11 localhost sshd[65830]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:13 localhost sshd[65832]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:17 localhost sshd[65834]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:21 localhost sshd[65836]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:08:23 localhost systemd[1]: tmp-crun.eeBItp.mount: Deactivated successfully. Feb 20 03:08:23 localhost podman[65838]: 2026-02-20 08:08:23.995271865 +0000 UTC m=+0.092215033 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 03:08:24 localhost podman[65838]: 2026-02-20 08:08:24.174252975 +0000 UTC m=+0.271196183 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step1, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:08:24 localhost systemd[1]: 
f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:08:25 localhost sshd[65866]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:30 localhost sshd[65868]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:32 localhost sshd[65870]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:08:33 localhost podman[65872]: 2026-02-20 08:08:33.965229478 +0000 UTC m=+0.087943012 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true) Feb 20 03:08:33 localhost podman[65872]: 2026-02-20 08:08:33.974872806 +0000 UTC m=+0.097586340 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 20 03:08:33 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:08:34 localhost sshd[65893]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:08:36 localhost podman[65895]: 2026-02-20 08:08:36.959069419 +0000 UTC m=+0.076685445 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z) Feb 20 03:08:36 localhost podman[65895]: 2026-02-20 08:08:36.968571703 +0000 UTC m=+0.086187759 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., container_name=iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container) Feb 20 03:08:36 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated 
successfully. Feb 20 03:08:37 localhost sshd[65916]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:37 localhost sshd[65917]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:40 localhost sshd[65920]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:41 localhost sshd[65922]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:46 localhost sshd[65924]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:50 localhost sshd[65926]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:08:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:08:54 localhost podman[65928]: 2026-02-20 08:08:54.538601759 +0000 UTC m=+0.098767777 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 20 03:08:54 localhost podman[65928]: 2026-02-20 08:08:54.754135535 +0000 UTC m=+0.314301563 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 
20 03:08:54 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:08:56 localhost sshd[66068]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:00 localhost sshd[66070]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:02 localhost sshd[66088]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:09:04 localhost systemd[1]: tmp-crun.U7QPFW.mount: Deactivated successfully. Feb 20 03:09:04 localhost podman[66090]: 2026-02-20 08:09:04.17478213 +0000 UTC m=+0.110988053 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, release=1766032510, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': 
'512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd) Feb 20 03:09:04 localhost podman[66090]: 2026-02-20 08:09:04.188004278 +0000 UTC m=+0.124210191 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:09:04 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:09:05 localhost sshd[66110]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:09:07 localhost sshd[66112]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:07 localhost podman[66113]: 2026-02-20 08:09:07.140192154 +0000 UTC m=+0.078070299 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, container_name=iscsid, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:09:07 localhost podman[66113]: 2026-02-20 08:09:07.153302478 +0000 UTC m=+0.091180633 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, version=17.1.13, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO 
Team, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:09:07 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:09:10 localhost sshd[66134]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:14 localhost sshd[66136]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:16 localhost sshd[66138]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:18 localhost python3[66187]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:09:19 localhost python3[66232]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574958.2871263-107660-72029222473044/source _original_basename=tmpkdj14tbp follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:09:20 localhost python3[66294]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:09:20 localhost python3[66337]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574959.9982226-107758-206776131067677/source _original_basename=tmp9141h3ir follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:09:20 localhost sshd[66338]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:21 localhost python3[66401]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:09:21 localhost python3[66444]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574961.0563428-107808-178098317184402/source _original_basename=tmp2iqg9iv5 follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:09:22 localhost python3[66506]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:09:22 localhost sshd[66550]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:22 localhost python3[66549]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574962.0301127-108022-30166966606923/source _original_basename=tmpuj_1ldb7 follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:09:23 localhost python3[66581]: ansible-systemd Invoked with daemon_reload=True enabled=True 
name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 20 03:09:23 localhost systemd[1]: Reloading. Feb 20 03:09:23 localhost systemd-sysv-generator[66606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:09:23 localhost systemd-rc-local-generator[66602]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:09:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:09:23 localhost systemd[1]: Reloading. Feb 20 03:09:23 localhost systemd-sysv-generator[66647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:09:23 localhost systemd-rc-local-generator[66644]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:09:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:09:24 localhost sshd[66672]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:24 localhost python3[66671]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:09:24 localhost systemd[1]: Reloading. Feb 20 03:09:24 localhost systemd-rc-local-generator[66700]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 03:09:24 localhost systemd-sysv-generator[66704]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:09:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:09:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:09:24 localhost systemd[1]: Reloading. Feb 20 03:09:25 localhost podman[66711]: 2026-02-20 08:09:25.011946875 +0000 UTC m=+0.095156175 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=) Feb 20 03:09:25 localhost systemd-sysv-generator[66765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:09:25 localhost systemd-rc-local-generator[66762]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:09:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 03:09:25 localhost podman[66711]: 2026-02-20 08:09:25.195558778 +0000 UTC m=+0.278767988 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1) Feb 20 03:09:25 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:09:25 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m. Feb 20 03:09:25 localhost python3[66793]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 03:09:25 localhost systemd[1]: Reloading. Feb 20 03:09:25 localhost systemd-rc-local-generator[66817]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:09:25 localhost systemd-sysv-generator[66823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:09:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 03:09:26 localhost python3[66877]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:09:26 localhost python3[66920]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574966.126908-108187-151158764478628/source _original_basename=tmpn1vjzgxz follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:09:27 localhost python3[66950]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:09:27 localhost systemd[1]: Reloading. Feb 20 03:09:27 localhost systemd-rc-local-generator[66972]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:09:27 localhost systemd-sysv-generator[66976]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:09:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:09:27 localhost systemd[1]: Reached target tripleo_nova_libvirt.target. 
Feb 20 03:09:27 localhost sshd[66989]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:28 localhost python3[67005]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:09:29 localhost ansible-async_wrapper.py[67178]: Invoked with 933272446365 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574969.1751366-108390-222623852624805/AnsiballZ_command.py _ Feb 20 03:09:29 localhost ansible-async_wrapper.py[67181]: Starting module and watcher Feb 20 03:09:29 localhost ansible-async_wrapper.py[67181]: Start watching 67182 (3600) Feb 20 03:09:29 localhost ansible-async_wrapper.py[67182]: Start module (67182) Feb 20 03:09:29 localhost ansible-async_wrapper.py[67178]: Return async_wrapper task started. Feb 20 03:09:29 localhost sshd[67198]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:30 localhost python3[67204]: ansible-ansible.legacy.async_status Invoked with jid=933272446365.67178 mode=status _async_dir=/tmp/.ansible_async Feb 20 03:09:33 localhost puppet-user[67203]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 03:09:33 localhost puppet-user[67203]: (file: /etc/puppet/hiera.yaml) Feb 20 03:09:33 localhost puppet-user[67203]: Warning: Undefined variable '::deploy_config_name'; Feb 20 03:09:33 localhost puppet-user[67203]: (file & line not available) Feb 20 03:09:33 localhost puppet-user[67203]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 03:09:33 localhost puppet-user[67203]: (file & line not available) Feb 20 03:09:33 localhost puppet-user[67203]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Feb 20 03:09:33 localhost puppet-user[67203]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:09:33 localhost puppet-user[67203]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:09:33 localhost sshd[67314]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:09:33 localhost puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:09:33 localhost puppet-user[67203]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:09:33 localhost puppet-user[67203]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:09:33 localhost puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:09:33 localhost puppet-user[67203]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:09:33 localhost puppet-user[67203]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:09:33 localhost puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:09:33 localhost puppet-user[67203]: with Stdlib::Compat::String. 
There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:09:33 localhost puppet-user[67203]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:09:33 localhost puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:09:33 localhost puppet-user[67203]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:09:33 localhost puppet-user[67203]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:09:33 localhost puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:09:33 localhost puppet-user[67203]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:09:33 localhost puppet-user[67203]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:09:33 localhost puppet-user[67203]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 20 03:09:33 localhost puppet-user[67203]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.27 seconds Feb 20 03:09:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:09:34 localhost systemd[1]: tmp-crun.VSDleg.mount: Deactivated successfully. 
Feb 20 03:09:34 localhost podman[67324]: 2026-02-20 08:09:34.698731867 +0000 UTC m=+0.105434712 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Feb 20 03:09:34 localhost podman[67324]: 2026-02-20 08:09:34.712985526 +0000 UTC m=+0.119688351 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 03:09:34 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 03:09:34 localhost ansible-async_wrapper.py[67181]: 67182 still running (3600)
Feb 20 03:09:35 localhost sshd[67344]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:09:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 03:09:38 localhost systemd[1]: tmp-crun.7MdOc4.mount: Deactivated successfully.
Feb 20 03:09:38 localhost podman[67365]: 2026-02-20 08:09:38.15116062 +0000 UTC m=+0.081487015 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 03:09:38 localhost podman[67365]: 2026-02-20 08:09:38.1900999 +0000 UTC m=+0.120426255 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, tcib_managed=true, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container)
Feb 20 03:09:38 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 03:09:39 localhost ansible-async_wrapper.py[67181]: 67182 still running (3595)
Feb 20 03:09:40 localhost python3[67447]: ansible-ansible.legacy.async_status Invoked with jid=933272446365.67178 mode=status _async_dir=/tmp/.ansible_async
Feb 20 03:09:41 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 03:09:41 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 20 03:09:41 localhost systemd[1]: Reloading.
Feb 20 03:09:41 localhost sshd[67462]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:09:41 localhost systemd-rc-local-generator[67495]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:09:41 localhost systemd-sysv-generator[67498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:09:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:09:41 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 03:09:41 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 03:09:41 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 20 03:09:41 localhost systemd[1]: man-db-cache-update.service: Consumed 1.001s CPU time.
Feb 20 03:09:41 localhost systemd[1]: run-rc7dd249622d24037bd01b1ac0dc99c51.service: Deactivated successfully.
Feb 20 03:09:42 localhost puppet-user[67203]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Feb 20 03:09:42 localhost puppet-user[67203]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}6efe8182ad743f857ee22a2729211e2ffe44f4518a2bdbc1aeccaa84211394dc'
Feb 20 03:09:42 localhost puppet-user[67203]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Feb 20 03:09:42 localhost puppet-user[67203]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Feb 20 03:09:42 localhost puppet-user[67203]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Feb 20 03:09:42 localhost puppet-user[67203]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Feb 20 03:09:43 localhost sshd[68549]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:09:44 localhost ansible-async_wrapper.py[67181]: 67182 still running (3590)
Feb 20 03:09:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 03:09:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 4463 writes, 20K keys, 4463 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4463 writes, 468 syncs, 9.54 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 227 writes, 681 keys, 227 commit groups, 1.0 writes per commit group, ingest: 0.63 MB, 0.00 MB/s#012Interval WAL: 227 writes, 110 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 03:09:47 localhost puppet-user[67203]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Feb 20 03:09:47 localhost systemd[1]: Reloading.
Feb 20 03:09:47 localhost systemd-rc-local-generator[68577]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:09:47 localhost systemd-sysv-generator[68583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:09:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:09:48 localhost sshd[68591]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:09:48 localhost systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Feb 20 03:09:48 localhost snmpd[68593]: Can't find directory of RPM packages
Feb 20 03:09:48 localhost snmpd[68593]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Feb 20 03:09:48 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Feb 20 03:09:48 localhost systemd[1]: Reloading.
Feb 20 03:09:48 localhost systemd-rc-local-generator[68619]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:09:48 localhost systemd-sysv-generator[68624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:09:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:09:48 localhost systemd[1]: Reloading.
Feb 20 03:09:48 localhost systemd-sysv-generator[68660]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:09:48 localhost systemd-rc-local-generator[68657]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:09:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:09:48 localhost puppet-user[67203]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Feb 20 03:09:48 localhost puppet-user[67203]: Notice: Applied catalog in 15.10 seconds
Feb 20 03:09:48 localhost puppet-user[67203]: Application:
Feb 20 03:09:48 localhost puppet-user[67203]: Initial environment: production
Feb 20 03:09:48 localhost puppet-user[67203]: Converged environment: production
Feb 20 03:09:48 localhost puppet-user[67203]: Run mode: user
Feb 20 03:09:48 localhost puppet-user[67203]: Changes:
Feb 20 03:09:48 localhost puppet-user[67203]: Total: 8
Feb 20 03:09:48 localhost puppet-user[67203]: Events:
Feb 20 03:09:48 localhost puppet-user[67203]: Success: 8
Feb 20 03:09:48 localhost puppet-user[67203]: Total: 8
Feb 20 03:09:48 localhost puppet-user[67203]: Resources:
Feb 20 03:09:48 localhost puppet-user[67203]: Restarted: 1
Feb 20 03:09:48 localhost puppet-user[67203]: Changed: 8
Feb 20 03:09:48 localhost puppet-user[67203]: Out of sync: 8
Feb 20 03:09:48 localhost puppet-user[67203]: Total: 19
Feb 20 03:09:48 localhost puppet-user[67203]: Time:
Feb 20 03:09:48 localhost puppet-user[67203]: Filebucket: 0.00
Feb 20 03:09:48 localhost puppet-user[67203]: Schedule: 0.00
Feb 20 03:09:48 localhost puppet-user[67203]: Augeas: 0.01
Feb 20 03:09:48 localhost puppet-user[67203]: File: 0.07
Feb 20 03:09:48 localhost puppet-user[67203]: Config retrieval: 0.33
Feb 20 03:09:48 localhost puppet-user[67203]: Service: 1.15
Feb 20 03:09:48 localhost puppet-user[67203]: Transaction evaluation: 15.09
Feb 20 03:09:48 localhost puppet-user[67203]: Catalog application: 15.10
Feb 20 03:09:48 localhost puppet-user[67203]: Last run: 1771574988
Feb 20 03:09:48 localhost puppet-user[67203]: Exec: 5.06
Feb 20 03:09:48 localhost puppet-user[67203]: Package: 8.64
Feb 20 03:09:48 localhost puppet-user[67203]: Total: 15.10
Feb 20 03:09:48 localhost puppet-user[67203]: Version:
Feb 20 03:09:48 localhost puppet-user[67203]: Config: 1771574973
Feb 20 03:09:48 localhost puppet-user[67203]: Puppet: 7.10.0
Feb 20 03:09:48 localhost ansible-async_wrapper.py[67182]: Module complete (67182)
Feb 20 03:09:49 localhost ansible-async_wrapper.py[67181]: Done in kid B.
Feb 20 03:09:50 localhost python3[68683]: ansible-ansible.legacy.async_status Invoked with jid=933272446365.67178 mode=status _async_dir=/tmp/.ansible_async
Feb 20 03:09:51 localhost python3[68699]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 03:09:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 03:09:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 5194 writes, 22K keys, 5194 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5194 writes, 621 syncs, 8.36 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 287 writes, 708 keys, 287 commit groups, 1.0 writes per commit group, ingest: 0.58 MB, 0.00 MB/s#012Interval WAL: 287 writes, 141 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 03:09:51 localhost sshd[68700]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:09:51 localhost python3[68717]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:09:52 localhost python3[68767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 03:09:52 localhost python3[68785]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpwkq6eo1o recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 03:09:53 localhost python3[68815]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:09:54 localhost python3[68918]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 20 03:09:54 localhost python3[68937]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:09:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 03:09:55 localhost podman[68970]: 2026-02-20 08:09:55.866340781 +0000 UTC m=+0.095570228 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1)
Feb 20 03:09:55 localhost python3[68969]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 03:09:56 localhost podman[68970]: 2026-02-20 08:09:56.069045593 +0000 UTC m=+0.298275040 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510)
Feb 20 03:09:56 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 03:09:56 localhost sshd[69048]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:09:56 localhost python3[69047]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 03:09:56 localhost python3[69067]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:09:57 localhost python3[69129]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 03:09:57 localhost python3[69147]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:09:58 localhost python3[69209]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 03:09:58 localhost python3[69227]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:09:58 localhost python3[69289]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 03:09:59 localhost sshd[69308]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:09:59 localhost python3[69307]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:09:59 localhost python3[69339]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:09:59 localhost systemd[1]: Reloading.
Feb 20 03:09:59 localhost systemd-rc-local-generator[69361]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:09:59 localhost systemd-sysv-generator[69366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:09:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:10:00 localhost python3[69425]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 03:10:00 localhost python3[69443]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:10:01 localhost python3[69535]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 03:10:01 localhost python3[69574]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 03:10:02 localhost sshd[69647]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:10:02 localhost python3[69634]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:10:02 localhost systemd[1]: Reloading.
Feb 20 03:10:02 localhost systemd-sysv-generator[69688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:10:02 localhost systemd-rc-local-generator[69684]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:10:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:10:02 localhost systemd[1]: Starting Create netns directory...
Feb 20 03:10:02 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 03:10:02 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 03:10:02 localhost systemd[1]: Finished Create netns directory.
Feb 20 03:10:03 localhost python3[69755]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 20 03:10:03 localhost podman[69796]: Feb 20 03:10:03 localhost podman[69796]: 2026-02-20 08:10:03.346467066 +0000 UTC m=+0.076107858 container create 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.buildah.version=1.42.2, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:10:03 localhost systemd[1]: Started libpod-conmon-8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6.scope. Feb 20 03:10:03 localhost systemd[1]: Started libcrun container. 
Feb 20 03:10:03 localhost podman[69796]: 2026-02-20 08:10:03.314427428 +0000 UTC m=+0.044068260 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 03:10:03 localhost podman[69796]: 2026-02-20 08:10:03.418945711 +0000 UTC m=+0.148586553 container init 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, ceph=True, io.openshift.tags=rhceph ceph, release=1770267347, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 03:10:03 localhost podman[69796]: 2026-02-20 08:10:03.429256789 +0000 UTC m=+0.158897581 container start 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, name=rhceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, build-date=2026-02-09T10:25:24Z, 
architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Feb 20 03:10:03 localhost podman[69796]: 2026-02-20 08:10:03.429522538 +0000 UTC m=+0.159163370 container attach 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, 
build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64) Feb 20 03:10:03 localhost funny_pasteur[69809]: 167 167 Feb 20 03:10:03 localhost systemd[1]: libpod-8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6.scope: Deactivated successfully. Feb 20 03:10:03 localhost podman[69796]: 2026-02-20 08:10:03.433344655 +0000 UTC m=+0.162985447 container died 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, name=rhceph, ceph=True, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=) Feb 20 03:10:03 localhost sshd[69813]: 
main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:03 localhost podman[69816]: 2026-02-20 08:10:03.517084747 +0000 UTC m=+0.075528600 container remove 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, release=1770267347, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 03:10:03 localhost systemd[1]: libpod-conmon-8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6.scope: Deactivated successfully. Feb 20 03:10:03 localhost systemd[1]: var-lib-containers-storage-overlay-affb0363b6af858faf35f8b44ac482767e10653be43d1f4616d5950413a5bebc-merged.mount: Deactivated successfully. 
Feb 20 03:10:03 localhost podman[69838]: Feb 20 03:10:03 localhost podman[69838]: 2026-02-20 08:10:03.739042553 +0000 UTC m=+0.076520862 container create a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, release=1770267347, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 03:10:03 localhost systemd[1]: Started libpod-conmon-a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d.scope. Feb 20 03:10:03 localhost podman[69838]: 2026-02-20 08:10:03.708002945 +0000 UTC m=+0.045481324 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 03:10:03 localhost systemd[1]: Started libcrun container. 
Feb 20 03:10:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf0e632f4d0b0113d16093d3f935c068270a53a091dba1bc45a4d4f299f29502/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf0e632f4d0b0113d16093d3f935c068270a53a091dba1bc45a4d4f299f29502/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf0e632f4d0b0113d16093d3f935c068270a53a091dba1bc45a4d4f299f29502/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:03 localhost podman[69838]: 2026-02-20 08:10:03.826683775 +0000 UTC m=+0.164162134 container init a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, io.buildah.version=1.42.2, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux , version=7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph) Feb 20 03:10:03 localhost podman[69838]: 2026-02-20 08:10:03.839144879 +0000 UTC m=+0.176623188 container start a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, release=1770267347, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.42.2, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 03:10:03 localhost podman[69838]: 2026-02-20 08:10:03.842386099 +0000 UTC m=+0.179864418 container attach a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=)
Feb 20 03:10:04 localhost confident_gauss[69869]: [
Feb 20 03:10:04 localhost confident_gauss[69869]: {
Feb 20 03:10:04 localhost confident_gauss[69869]: "available": false,
Feb 20 03:10:04 localhost confident_gauss[69869]: "ceph_device": false,
Feb 20 03:10:04 localhost confident_gauss[69869]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 03:10:04 localhost confident_gauss[69869]: "lsm_data": {},
Feb 20 03:10:04 localhost confident_gauss[69869]: "lvs": [],
Feb 20 03:10:04 localhost confident_gauss[69869]: "path": "/dev/sr0",
Feb 20 03:10:04 localhost confident_gauss[69869]: "rejected_reasons": [
Feb 20 03:10:04 localhost confident_gauss[69869]: "Insufficient space (<5GB)",
Feb 20 03:10:04 localhost confident_gauss[69869]: "Has a FileSystem"
Feb 20 03:10:04 localhost confident_gauss[69869]: ],
Feb 20 03:10:04 localhost confident_gauss[69869]: "sys_api": {
Feb 20 03:10:04 localhost confident_gauss[69869]: "actuators": null,
Feb 20 03:10:04 localhost confident_gauss[69869]: "device_nodes": "sr0",
Feb 20 03:10:04 localhost confident_gauss[69869]: "human_readable_size": "482.00 KB",
Feb 20 03:10:04 localhost confident_gauss[69869]: "id_bus": "ata",
Feb 20 03:10:04 localhost confident_gauss[69869]: "model": "QEMU DVD-ROM",
Feb 20 03:10:04 localhost confident_gauss[69869]: "nr_requests": "2",
Feb 20 03:10:04 localhost confident_gauss[69869]: "partitions": {},
Feb 20 03:10:04 localhost confident_gauss[69869]: "path": "/dev/sr0",
Feb 20 03:10:04 localhost confident_gauss[69869]: "removable": "1",
Feb 20 03:10:04 localhost confident_gauss[69869]: "rev": "2.5+",
Feb 20 03:10:04 localhost confident_gauss[69869]: "ro": "0",
Feb 20 03:10:04 localhost confident_gauss[69869]: "rotational": "1",
Feb 20 03:10:04 localhost confident_gauss[69869]: "sas_address": "",
Feb 20 03:10:04 localhost confident_gauss[69869]: "sas_device_handle": "",
Feb 20 03:10:04 localhost confident_gauss[69869]: "scheduler_mode": "mq-deadline",
Feb 20 03:10:04 localhost confident_gauss[69869]: "sectors": 0,
Feb 20 03:10:04 localhost confident_gauss[69869]: "sectorsize": "2048",
Feb 20 03:10:04 localhost confident_gauss[69869]: "size": 493568.0,
Feb 20 03:10:04 localhost confident_gauss[69869]: "support_discard": "0",
Feb 20 03:10:04 localhost confident_gauss[69869]: "type": "disk",
Feb 20 03:10:04 localhost confident_gauss[69869]: "vendor": "QEMU"
Feb 20 03:10:04 localhost confident_gauss[69869]: }
Feb 20 03:10:04 localhost confident_gauss[69869]: }
Feb 20 03:10:04 localhost confident_gauss[69869]: ]
Feb 20 03:10:04 localhost systemd[1]: libpod-a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d.scope: Deactivated successfully.
Feb 20 03:10:04 localhost podman[69838]: 2026-02-20 08:10:04.805088646 +0000 UTC m=+1.142566965 container died a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, release=1770267347, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.42.2, vcs-type=git) Feb 20 03:10:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:10:04 localhost systemd[1]: var-lib-containers-storage-overlay-bf0e632f4d0b0113d16093d3f935c068270a53a091dba1bc45a4d4f299f29502-merged.mount: Deactivated successfully. 
Feb 20 03:10:04 localhost podman[71720]: 2026-02-20 08:10:04.875253219 +0000 UTC m=+0.064226141 container remove a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux , RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 03:10:04 localhost systemd[1]: libpod-conmon-a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d.scope: Deactivated successfully. 
Feb 20 03:10:04 localhost podman[71727]: 2026-02-20 08:10:04.928612695 +0000 UTC m=+0.094885857 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:10:04 localhost podman[71727]: 2026-02-20 08:10:04.940746839 +0000 UTC m=+0.107020001 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:10:04 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:10:05 localhost python3[71772]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 20 03:10:05 localhost sshd[71878]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:05 localhost podman[71934]: 2026-02-20 08:10:05.467506333 +0000 UTC m=+0.060081264 container create eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:10:05 localhost systemd[1]: Started libpod-conmon-eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3.scope. 
Feb 20 03:10:05 localhost podman[71953]: 2026-02-20 08:10:05.50470875 +0000 UTC m=+0.074664534 container create 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4) Feb 20 03:10:05 localhost systemd[1]: Started libcrun container. Feb 20 03:10:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207/merged/etc/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:05 localhost podman[71934]: 2026-02-20 08:10:05.526320657 +0000 UTC m=+0.118895588 container init eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc.) Feb 20 03:10:05 localhost systemd[1]: Started libpod-conmon-1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.scope. Feb 20 03:10:05 localhost podman[71934]: 2026-02-20 08:10:05.438811808 +0000 UTC m=+0.031386759 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 20 03:10:05 localhost podman[71934]: 2026-02-20 08:10:05.538328427 +0000 UTC m=+0.130903368 container start eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, managed_by=tripleo_ansible) Feb 20 03:10:05 localhost podman[71934]: 2026-02-20 08:10:05.538611785 +0000 UTC m=+0.131186737 container attach eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 
'6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_libvirt_init_secret, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., 
com.redhat.component=openstack-nova-libvirt-container) Feb 20 03:10:05 localhost systemd[1]: Started libcrun container. Feb 20 03:10:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b03ed83be81af8ca31d355d34bc84741adbeedeb0b33580fe27349115e799d7/merged/var/log/containers supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:05 localhost podman[71935]: 2026-02-20 08:10:05.461337582 +0000 UTC m=+0.046658459 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 20 03:10:05 localhost podman[71949]: 2026-02-20 08:10:05.46546459 +0000 UTC m=+0.040042486 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 20 03:10:05 localhost podman[71953]: 2026-02-20 08:10:05.466151051 +0000 UTC m=+0.036106845 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 20 03:10:05 localhost podman[71949]: 2026-02-20 08:10:05.565779154 +0000 UTC m=+0.140357050 container create 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, container_name=configure_cms_options, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, 
name=rhosp-rhel9/openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=, release=1766032510) Feb 20 03:10:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:10:05 localhost podman[71953]: 2026-02-20 08:10:05.572586373 +0000 UTC m=+0.142542197 container init 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, architecture=x86_64) Feb 20 03:10:05 localhost podman[71954]: 2026-02-20 08:10:05.475415876 +0000 UTC m=+0.040463488 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 20 03:10:05 localhost podman[71935]: 2026-02-20 08:10:05.576526434 +0000 UTC m=+0.161847341 container create cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, container_name=ceilometer_agent_compute) Feb 20 03:10:05 localhost podman[71954]: 2026-02-20 08:10:05.589819154 +0000 UTC m=+0.154866756 container create 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, version=17.1.13, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 20 03:10:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:10:05 localhost systemd[1]: Started libpod-conmon-1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39.scope. Feb 20 03:10:05 localhost podman[71953]: 2026-02-20 08:10:05.603092474 +0000 UTC m=+0.173048268 container start 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, version=17.1.13, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:10:05 localhost systemd[1]: Started libpod-conmon-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.scope. 
Feb 20 03:10:05 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 20 03:10:05 localhost systemd[1]: Started libcrun container. Feb 20 03:10:05 localhost systemd[1]: Started libcrun container. Feb 20 03:10:05 localhost systemd[1]: Started libpod-conmon-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.scope. Feb 20 03:10:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1c07b1bd08758bd14fb80cc901f6da6a3ccc5e5eba94f04ead08e95db5f3037/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:05 localhost systemd[1]: Started libcrun container. 
Feb 20 03:10:05 localhost podman[71949]: 2026-02-20 08:10:05.640355083 +0000 UTC m=+0.214932979 container init 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=configure_cms_options, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com) Feb 20 03:10:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/271fbe47d50a90f03735a26a1ff5b20e2027c13cb6e9d5c8a6a9112793cd7c92/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:05 localhost podman[71949]: 2026-02-20 08:10:05.647085871 +0000 UTC m=+0.221663757 container start 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=configure_cms_options, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}) Feb 20 03:10:05 localhost podman[71949]: 2026-02-20 08:10:05.647286137 +0000 UTC m=+0.221864033 container attach 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, container_name=configure_cms_options, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, build-date=2026-01-12T22:36:40Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-type=git) Feb 20 03:10:05 localhost systemd[1]: libpod-eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3.scope: Deactivated successfully. 
Feb 20 03:10:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:10:05 localhost podman[71954]: 2026-02-20 08:10:05.668862452 +0000 UTC m=+0.233910104 container init 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public) Feb 20 03:10:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:10:05 localhost podman[71954]: 2026-02-20 08:10:05.683145652 +0000 UTC m=+0.248193264 container start 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1) Feb 20 03:10:05 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ed809cd151e1fa8da7409fe229c809b7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 
--label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume 
/var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 20 03:10:05 localhost podman[71934]: 2026-02-20 08:10:05.70347175 +0000 UTC m=+0.296046711 container died eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_libvirt_init_secret, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z) Feb 20 03:10:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. 
Feb 20 03:10:05 localhost podman[71935]: 2026-02-20 08:10:05.713436106 +0000 UTC m=+0.298756983 container init cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:10:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:10:05 localhost podman[71935]: 2026-02-20 08:10:05.739887752 +0000 UTC m=+0.325208619 container start cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z) Feb 20 03:10:05 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ed809cd151e1fa8da7409fe229c809b7 
--healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume 
/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 20 03:10:05 localhost ovs-vsctl[72155]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options Feb 20 03:10:05 localhost podman[72030]: 2026-02-20 08:10:05.824370467 +0000 UTC m=+0.217905320 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond) Feb 20 03:10:05 localhost systemd[1]: libpod-1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39.scope: Deactivated successfully. 
Feb 20 03:10:05 localhost podman[72030]: 2026-02-20 08:10:05.833141738 +0000 UTC m=+0.226676601 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:10:05 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:10:05 localhost podman[71949]: 2026-02-20 08:10:05.884391138 +0000 UTC m=+0.458969024 container died 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, container_name=configure_cms_options, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team) Feb 20 03:10:05 localhost podman[72121]: 2026-02-20 08:10:05.8977512 +0000 UTC m=+0.158599902 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, config_id=tripleo_step4, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:10:05 localhost podman[72174]: 2026-02-20 08:10:05.953246922 +0000 UTC m=+0.109311882 container cleanup 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public) Feb 20 03:10:05 localhost systemd[1]: libpod-conmon-1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39.scope: Deactivated successfully. 
Feb 20 03:10:05 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi Feb 20 03:10:05 localhost podman[72066]: 2026-02-20 08:10:05.989875441 +0000 UTC m=+0.328168721 container cleanup eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': 
{'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.created=2026-01-12T23:31:49Z) Feb 20 03:10:05 localhost podman[72080]: 2026-02-20 08:10:05.99276393 +0000 UTC m=+0.307523594 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:10:06 localhost systemd[1]: 
libpod-conmon-eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3.scope: Deactivated successfully. Feb 20 03:10:06 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack Feb 20 03:10:06 localhost podman[72121]: 2026-02-20 08:10:06.03068843 +0000 UTC m=+0.291537152 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true) Feb 20 03:10:06 localhost podman[72121]: unhealthy Feb 20 03:10:06 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:10:06 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed with result 
'exit-code'. Feb 20 03:10:06 localhost podman[72080]: 2026-02-20 08:10:06.05569806 +0000 UTC m=+0.370457734 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 20 03:10:06 localhost podman[72080]: unhealthy Feb 20 03:10:06 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:10:06 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed with result 'exit-code'. 
Feb 20 03:10:06 localhost podman[72230]: 2026-02-20 08:10:05.970031799 +0000 UTC m=+0.032033839 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 03:10:06 localhost podman[72230]: 2026-02-20 08:10:06.110501301 +0000 UTC m=+0.172503341 container create b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, 
Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public) Feb 20 03:10:06 localhost podman[72305]: 2026-02-20 08:10:06.135410819 +0000 UTC m=+0.055190663 container create 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=setup_ovs_manager, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4) Feb 20 03:10:06 localhost systemd[1]: Started libpod-conmon-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.scope. Feb 20 03:10:06 localhost systemd[1]: Started libpod-conmon-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a.scope. Feb 20 03:10:06 localhost systemd[1]: Started libcrun container. 
Feb 20 03:10:06 localhost systemd[1]: Started libcrun container. Feb 20 03:10:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a6d255614f6fb8bbe458bab22374857122c06c78d4c0aacb8f6490a72d4cd61/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:10:06 localhost podman[72305]: 2026-02-20 08:10:06.191387935 +0000 UTC m=+0.111167779 container init 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, container_name=setup_ovs_manager, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 20 03:10:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:10:06 localhost podman[72230]: 2026-02-20 08:10:06.195268875 +0000 UTC m=+0.257270935 container init b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target) Feb 20 03:10:06 localhost podman[72305]: 2026-02-20 08:10:06.201528318 +0000 UTC m=+0.121308132 container start 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=setup_ovs_manager, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 20 03:10:06 localhost podman[72305]: 2026-02-20 08:10:06.201764856 +0000 UTC m=+0.121544720 container attach 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=setup_ovs_manager, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true) Feb 20 03:10:06 localhost podman[72305]: 2026-02-20 08:10:06.103547667 +0000 UTC m=+0.023327471 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 20 03:10:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:10:06 localhost podman[72230]: 2026-02-20 08:10:06.235608439 +0000 UTC m=+0.297610479 container start b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, 
com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:10:06 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 03:10:06 localhost podman[72348]: 2026-02-20 08:10:06.298853689 +0000 UTC 
m=+0.056854104 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container) Feb 20 03:10:06 localhost systemd[1]: var-lib-containers-storage-overlay-33265afbb0ab1192cc35fd8be9e517c4969c8f23f7a1676738a90556ed12fe7c-merged.mount: Deactivated successfully. Feb 20 03:10:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39-userdata-shm.mount: Deactivated successfully. Feb 20 03:10:06 localhost systemd[1]: var-lib-containers-storage-overlay-9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207-merged.mount: Deactivated successfully. Feb 20 03:10:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3-userdata-shm.mount: Deactivated successfully. 
Feb 20 03:10:06 localhost podman[72348]: 2026-02-20 08:10:06.653303809 +0000 UTC m=+0.411304284 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4) Feb 20 03:10:06 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:10:07 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Feb 20 03:10:08 localhost sshd[72528]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:10:08 localhost sshd[72546]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:08 localhost ovs-vsctl[72547]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager Feb 20 03:10:08 localhost systemd[1]: tmp-crun.H0m1wQ.mount: Deactivated successfully. 
Feb 20 03:10:08 localhost podman[72534]: 2026-02-20 08:10:08.949563449 +0000 UTC m=+0.131473535 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com) Feb 20 03:10:08 localhost podman[72534]: 2026-02-20 08:10:08.987960253 +0000 UTC m=+0.169870309 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, container_name=iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:10:08 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:10:09 localhost systemd[1]: libpod-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a.scope: Deactivated successfully. Feb 20 03:10:09 localhost systemd[1]: libpod-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a.scope: Consumed 2.827s CPU time. 
Feb 20 03:10:09 localhost podman[72559]: 2026-02-20 08:10:09.173525115 +0000 UTC m=+0.058391051 container died 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=setup_ovs_manager, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Feb 20 03:10:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a-userdata-shm.mount: Deactivated successfully. Feb 20 03:10:09 localhost podman[72559]: 2026-02-20 08:10:09.209087582 +0000 UTC m=+0.093953468 container cleanup 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=setup_ovs_manager, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510) Feb 20 03:10:09 localhost systemd[1]: libpod-conmon-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a.scope: Deactivated successfully. 
Feb 20 03:10:09 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Feb 20 03:10:09 localhost podman[72672]: 2026-02-20 08:10:09.684043659 +0000 UTC m=+0.073002692 container create 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 20 03:10:09 localhost systemd[1]: Started libpod-conmon-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.scope.
Feb 20 03:10:09 localhost podman[72678]: 2026-02-20 08:10:09.728168539 +0000 UTC m=+0.103698499 container create 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 03:10:09 localhost systemd[1]: Started libcrun container.
Feb 20 03:10:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d2a27c37c1e0aa5be6fdab947882ef1f426e5cc1bd21c037426b7439e8b098c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 20 03:10:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d2a27c37c1e0aa5be6fdab947882ef1f426e5cc1bd21c037426b7439e8b098c/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Feb 20 03:10:09 localhost podman[72672]: 2026-02-20 08:10:09.645690106 +0000 UTC m=+0.034649199 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 03:10:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d2a27c37c1e0aa5be6fdab947882ef1f426e5cc1bd21c037426b7439e8b098c/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Feb 20 03:10:09 localhost systemd[1]: Started libpod-conmon-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.scope.
Feb 20 03:10:09 localhost podman[72678]: 2026-02-20 08:10:09.678057034 +0000 UTC m=+0.053587044 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 20 03:10:09 localhost systemd[1]: Started libcrun container.
Feb 20 03:10:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d98c062b21764b21e0b6595874844668fb8ff8886b054dd456077eeaff5c7e50/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 03:10:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d98c062b21764b21e0b6595874844668fb8ff8886b054dd456077eeaff5c7e50/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 03:10:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d98c062b21764b21e0b6595874844668fb8ff8886b054dd456077eeaff5c7e50/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Feb 20 03:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 03:10:09 localhost podman[72672]: 2026-02-20 08:10:09.788112138 +0000 UTC m=+0.177071211 container init 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, container_name=ovn_controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=)
Feb 20 03:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 03:10:09 localhost podman[72672]: 2026-02-20 08:10:09.832582659 +0000 UTC m=+0.221541712 container start 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, distribution-scope=public, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 20 03:10:09 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 03:10:09 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 03:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 03:10:09 localhost podman[72678]: 2026-02-20 08:10:09.84007436 +0000 UTC m=+0.215604360 container init 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 03:10:09 localhost systemd[1]: Created slice User Slice of UID 0.
Feb 20 03:10:09 localhost systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 20 03:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 03:10:09 localhost systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 20 03:10:09 localhost systemd[1]: Starting User Manager for UID 0...
Feb 20 03:10:09 localhost podman[72678]: 2026-02-20 08:10:09.88124944 +0000 UTC m=+0.256779430 container start 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Feb 20 03:10:09 localhost python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=684ebb6e94768a0a31a4d8592f0686b3 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 20 03:10:09 localhost systemd[1]: var-lib-containers-storage-overlay-ef9cc1375c4a3e979779fde9a22d44caa1f8d54d9be8e432ea85c98c54294ad4-merged.mount: Deactivated successfully.
Feb 20 03:10:09 localhost podman[72711]: 2026-02-20 08:10:09.945286854 +0000 UTC m=+0.104797292 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 03:10:09 localhost podman[72711]: 2026-02-20 08:10:09.962049601 +0000 UTC m=+0.121560029 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 20 03:10:09 localhost podman[72711]: unhealthy
Feb 20 03:10:09 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 03:10:09 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 03:10:10 localhost podman[72729]: 2026-02-20 08:10:10.04242155 +0000 UTC m=+0.156557619 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5)
Feb 20 03:10:10 localhost systemd[72731]: Queued start job for default target Main User Target.
Feb 20 03:10:10 localhost systemd[72731]: Created slice User Application Slice.
Feb 20 03:10:10 localhost systemd[72731]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 20 03:10:10 localhost systemd[72731]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 03:10:10 localhost systemd[72731]: Reached target Paths.
Feb 20 03:10:10 localhost systemd[72731]: Reached target Timers.
Feb 20 03:10:10 localhost systemd[72731]: Starting D-Bus User Message Bus Socket...
Feb 20 03:10:10 localhost systemd[72731]: Starting Create User's Volatile Files and Directories...
Feb 20 03:10:10 localhost systemd[72731]: Finished Create User's Volatile Files and Directories.
Feb 20 03:10:10 localhost systemd[72731]: Listening on D-Bus User Message Bus Socket. Feb 20 03:10:10 localhost systemd[72731]: Reached target Sockets. Feb 20 03:10:10 localhost systemd[72731]: Reached target Basic System. Feb 20 03:10:10 localhost systemd[72731]: Reached target Main User Target. Feb 20 03:10:10 localhost systemd[72731]: Startup finished in 137ms. Feb 20 03:10:10 localhost systemd[1]: Started User Manager for UID 0. Feb 20 03:10:10 localhost systemd[1]: Started Session c9 of User root. Feb 20 03:10:10 localhost podman[72729]: 2026-02-20 08:10:10.108891589 +0000 UTC m=+0.223027678 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64) Feb 20 03:10:10 localhost podman[72729]: unhealthy Feb 20 03:10:10 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:10:10 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:10:10 localhost systemd[1]: session-c9.scope: Deactivated successfully. Feb 20 03:10:10 localhost kernel: device br-int entered promiscuous mode Feb 20 03:10:10 localhost NetworkManager[5988]: [1771575010.2048] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Feb 20 03:10:10 localhost systemd-udevd[72825]: Network interface NamePolicy= disabled on kernel command line. Feb 20 03:10:10 localhost kernel: device genev_sys_6081 entered promiscuous mode Feb 20 03:10:10 localhost systemd-udevd[72830]: Network interface NamePolicy= disabled on kernel command line. Feb 20 03:10:10 localhost NetworkManager[5988]: [1771575010.2400] device (genev_sys_6081): carrier: link connected Feb 20 03:10:10 localhost NetworkManager[5988]: [1771575010.2403] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12) Feb 20 03:10:10 localhost python3[72849]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:10 localhost python3[72865]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:11 localhost python3[72881]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:11 localhost python3[72897]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:11 localhost python3[72913]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:11 localhost python3[72933]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:12 localhost python3[72949]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:10:12 localhost python3[72967]: ansible-stat Invoked with 
path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:10:12 localhost python3[72985]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:10:12 localhost python3[73001]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:10:13 localhost python3[73017]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:10:13 localhost python3[73033]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:10:14 localhost python3[73094]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:14 localhost sshd[73095]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:14 localhost python3[73125]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root 
backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:15 localhost python3[73155]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:15 localhost python3[73184]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:16 localhost python3[73213]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:16 localhost python3[73242]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source 
dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:17 localhost python3[73258]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 03:10:17 localhost systemd[1]: Reloading. Feb 20 03:10:17 localhost systemd-rc-local-generator[73279]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:17 localhost systemd-sysv-generator[73284]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:10:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:10:18 localhost python3[73310]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:10:19 localhost systemd[1]: Reloading. Feb 20 03:10:19 localhost systemd-rc-local-generator[73336]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:19 localhost systemd-sysv-generator[73339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 03:10:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:10:19 localhost systemd[1]: Starting ceilometer_agent_compute container... Feb 20 03:10:19 localhost tripleo-start-podman-container[73350]: Creating additional drop-in dependency for "ceilometer_agent_compute" (cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a) Feb 20 03:10:19 localhost systemd[1]: Reloading. Feb 20 03:10:19 localhost systemd-rc-local-generator[73407]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:19 localhost systemd-sysv-generator[73411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:10:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:10:19 localhost systemd[1]: Started ceilometer_agent_compute container. Feb 20 03:10:20 localhost systemd[1]: Stopping User Manager for UID 0... Feb 20 03:10:20 localhost systemd[72731]: Activating special unit Exit the Session... Feb 20 03:10:20 localhost systemd[72731]: Stopped target Main User Target. Feb 20 03:10:20 localhost systemd[72731]: Stopped target Basic System. Feb 20 03:10:20 localhost systemd[72731]: Stopped target Paths. Feb 20 03:10:20 localhost systemd[72731]: Stopped target Sockets. Feb 20 03:10:20 localhost systemd[72731]: Stopped target Timers. Feb 20 03:10:20 localhost systemd[72731]: Stopped Daily Cleanup of User's Temporary Directories. Feb 20 03:10:20 localhost systemd[72731]: Closed D-Bus User Message Bus Socket. Feb 20 03:10:20 localhost systemd[72731]: Stopped Create User's Volatile Files and Directories. 
Feb 20 03:10:20 localhost systemd[72731]: Removed slice User Application Slice. Feb 20 03:10:20 localhost systemd[72731]: Reached target Shutdown. Feb 20 03:10:20 localhost systemd[72731]: Finished Exit the Session. Feb 20 03:10:20 localhost systemd[72731]: Reached target Exit the Session. Feb 20 03:10:20 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 20 03:10:20 localhost systemd[1]: Stopped User Manager for UID 0. Feb 20 03:10:20 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 20 03:10:20 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 20 03:10:20 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 20 03:10:20 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 20 03:10:20 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 20 03:10:20 localhost python3[73434]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:10:20 localhost systemd[1]: Reloading. Feb 20 03:10:20 localhost systemd-rc-local-generator[73459]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:20 localhost systemd-sysv-generator[73466]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:10:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:10:20 localhost systemd[1]: Starting ceilometer_agent_ipmi container... Feb 20 03:10:21 localhost systemd[1]: Started ceilometer_agent_ipmi container. 
Feb 20 03:10:21 localhost sshd[73489]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:21 localhost python3[73506]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:10:21 localhost systemd[1]: Reloading. Feb 20 03:10:22 localhost systemd-sysv-generator[73537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:10:22 localhost systemd-rc-local-generator[73531]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:10:22 localhost systemd[1]: Starting logrotate_crond container... Feb 20 03:10:22 localhost systemd[1]: Started logrotate_crond container. Feb 20 03:10:22 localhost python3[73572]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:10:23 localhost systemd[1]: Reloading. Feb 20 03:10:23 localhost systemd-rc-local-generator[73600]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:23 localhost systemd-sysv-generator[73606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:10:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 03:10:23 localhost systemd[1]: Starting nova_migration_target container... Feb 20 03:10:23 localhost systemd[1]: Started nova_migration_target container. Feb 20 03:10:24 localhost python3[73640]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:10:24 localhost systemd[1]: Reloading. Feb 20 03:10:24 localhost systemd-sysv-generator[73671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:10:24 localhost systemd-rc-local-generator[73666]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:10:24 localhost systemd[1]: Starting ovn_controller container... Feb 20 03:10:24 localhost tripleo-start-podman-container[73680]: Creating additional drop-in dependency for "ovn_controller" (0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850) Feb 20 03:10:24 localhost systemd[1]: Reloading. Feb 20 03:10:24 localhost systemd-rc-local-generator[73735]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:24 localhost systemd-sysv-generator[73739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:10:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 03:10:24 localhost systemd[1]: Started ovn_controller container. Feb 20 03:10:25 localhost python3[73764]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:10:25 localhost systemd[1]: Reloading. Feb 20 03:10:25 localhost systemd-rc-local-generator[73790]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:10:25 localhost systemd-sysv-generator[73793]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:10:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:10:25 localhost systemd[1]: Starting ovn_metadata_agent container... Feb 20 03:10:26 localhost systemd[1]: Started ovn_metadata_agent container. Feb 20 03:10:26 localhost sshd[73816]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:10:26 localhost podman[73848]: 2026-02-20 08:10:26.493527701 +0000 UTC m=+0.086128388 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, url=https://www.redhat.com, release=1766032510) Feb 20 03:10:26 localhost python3[73847]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:26 localhost podman[73848]: 2026-02-20 08:10:26.691002431 +0000 UTC m=+0.283603118 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 03:10:26 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:10:28 localhost python3[73998]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005625204 step=4 update_config_hash_only=False Feb 20 03:10:28 localhost python3[74015]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:10:28 localhost python3[74031]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 20 03:10:30 localhost sshd[74032]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:32 localhost sshd[74033]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:10:35 localhost podman[74036]: 2026-02-20 08:10:35.167883002 +0000 UTC m=+0.099789859 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:10:35 localhost podman[74036]: 2026-02-20 08:10:35.184203484 +0000 UTC m=+0.116110351 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 20 03:10:35 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:10:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:10:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:10:36 localhost systemd[1]: tmp-crun.bf5KdM.mount: Deactivated successfully. Feb 20 03:10:36 localhost podman[74057]: 2026-02-20 08:10:36.137849723 +0000 UTC m=+0.072198198 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:10:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:10:36 localhost podman[74057]: 2026-02-20 08:10:36.212438292 +0000 UTC m=+0.146786747 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:10:36 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:10:36 localhost podman[74091]: 2026-02-20 08:10:36.226180816 +0000 UTC m=+0.066246743 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true) Feb 20 03:10:36 localhost podman[74056]: 2026-02-20 08:10:36.194970044 +0000 UTC m=+0.133749986 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, 
description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, architecture=x86_64, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-cron-container, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:10:36 localhost podman[74056]: 2026-02-20 08:10:36.274075593 +0000 UTC m=+0.212855555 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack 
Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:10:36 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:10:36 localhost podman[74091]: 2026-02-20 08:10:36.325754647 +0000 UTC m=+0.165820574 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=) Feb 20 03:10:36 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:10:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:10:36 localhost podman[74128]: 2026-02-20 08:10:36.883007511 +0000 UTC m=+0.089537442 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-nova-compute-container) Feb 20 03:10:37 localhost podman[74128]: 2026-02-20 08:10:37.223713717 +0000 UTC m=+0.430243638 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 20 03:10:37 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:10:37 localhost sshd[74150]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:10:39 localhost podman[74152]: 2026-02-20 08:10:39.150492683 +0000 UTC m=+0.086650233 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:10:39 localhost podman[74152]: 2026-02-20 08:10:39.163568467 +0000 UTC m=+0.099726027 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git) Feb 20 03:10:39 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:10:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:10:40 localhost podman[74172]: 2026-02-20 08:10:40.130724291 +0000 UTC m=+0.067286536 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:36:40Z, release=1766032510) Feb 20 03:10:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:10:40 localhost podman[74172]: 2026-02-20 08:10:40.183318693 +0000 UTC m=+0.119880928 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:10:40 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:10:40 localhost podman[74192]: 2026-02-20 08:10:40.239543047 +0000 UTC m=+0.085429896 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z) Feb 20 03:10:40 localhost podman[74192]: 2026-02-20 
08:10:40.288291339 +0000 UTC m=+0.134178198 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:10:40 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:10:41 localhost sshd[74218]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:46 localhost sshd[74220]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:48 localhost snmpd[68593]: empty variable list in _query Feb 20 03:10:48 localhost snmpd[68593]: empty variable list in _query Feb 20 03:10:51 localhost sshd[74222]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:54 localhost sshd[74224]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:10:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:10:57 localhost podman[74226]: 2026-02-20 08:10:57.146133586 +0000 UTC m=+0.079565865 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:10:14Z, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:10:57 localhost podman[74226]: 2026-02-20 08:10:57.365277733 +0000 UTC m=+0.298710032 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 03:10:57 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:10:57 localhost sshd[74256]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:11:05 localhost podman[74258]: 2026-02-20 08:11:05.73431542 +0000 UTC m=+0.076618734 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z) Feb 20 03:11:05 localhost podman[74258]: 2026-02-20 08:11:05.741834211 +0000 UTC m=+0.084137525 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, distribution-scope=public, container_name=collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:11:05 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:11:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:11:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:11:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:11:06 localhost podman[74296]: 2026-02-20 08:11:06.810746974 +0000 UTC m=+0.095744503 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64) Feb 20 03:11:06 localhost podman[74295]: 2026-02-20 08:11:06.86866277 +0000 UTC m=+0.153808944 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:11:06 localhost podman[74296]: 2026-02-20 08:11:06.895141476 +0000 UTC m=+0.180138985 container exec_died 
cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, release=1766032510, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:11:06 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:11:06 localhost podman[74295]: 2026-02-20 08:11:06.914146223 +0000 UTC m=+0.199292397 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:11:06 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:11:06 localhost podman[74294]: 2026-02-20 08:11:06.929229647 +0000 UTC m=+0.214732012 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 20 03:11:06 localhost podman[74294]: 2026-02-20 08:11:06.958018775 +0000 UTC m=+0.243521180 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4) Feb 20 03:11:06 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:11:07 localhost sshd[74414]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:11:08 localhost systemd[1]: tmp-crun.VDjnuj.mount: Deactivated successfully. 
Feb 20 03:11:08 localhost podman[74415]: 2026-02-20 08:11:08.16288251 +0000 UTC m=+0.098263672 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:11:08 localhost podman[74415]: 2026-02-20 08:11:08.530215448 +0000 UTC m=+0.465596550 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1766032510, maintainer=OpenStack TripleO Team, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:11:08 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:11:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:11:10 localhost podman[74453]: 2026-02-20 08:11:10.132693554 +0000 UTC m=+0.075949633 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git) Feb 20 03:11:10 localhost podman[74453]: 2026-02-20 08:11:10.143617821 +0000 UTC m=+0.086873880 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, container_name=iscsid, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:11:10 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:11:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:11:10 localhost podman[74473]: 2026-02-20 08:11:10.315228222 +0000 UTC m=+0.070312499 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, 
tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=ovn_controller, batch=17.1_20260112.1) Feb 20 03:11:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:11:10 localhost podman[74494]: 2026-02-20 08:11:10.405855487 +0000 UTC m=+0.072593500 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:11:10 localhost podman[74473]: 2026-02-20 08:11:10.4163222 +0000 UTC m=+0.171406437 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Feb 20 03:11:10 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:11:10 localhost podman[74494]: 2026-02-20 08:11:10.453125774 +0000 UTC m=+0.119863797 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, architecture=x86_64, distribution-scope=public) Feb 20 03:11:10 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:11:13 localhost sshd[74522]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:19 localhost sshd[74524]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:21 localhost sshd[74526]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:23 localhost sshd[74528]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:27 localhost sshd[74530]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:11:28 localhost podman[74531]: 2026-02-20 08:11:28.159384502 +0000 UTC m=+0.093876727 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:11:28 localhost podman[74531]: 2026-02-20 08:11:28.392333825 +0000 UTC m=+0.326826080 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:11:28 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:11:32 localhost sshd[74562]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:11:36 localhost podman[74564]: 2026-02-20 08:11:36.144505808 +0000 UTC m=+0.084074633 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1) Feb 20 03:11:36 localhost podman[74564]: 2026-02-20 08:11:36.153413233 +0000 UTC m=+0.092982068 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:11:36 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:11:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:11:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:11:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:11:37 localhost podman[74587]: 2026-02-20 08:11:37.150565722 +0000 UTC m=+0.078355527 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:11:37 localhost podman[74585]: 2026-02-20 08:11:37.205761464 +0000 UTC m=+0.140582846 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 
17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 20 03:11:37 localhost podman[74586]: 2026-02-20 
08:11:37.26171497 +0000 UTC m=+0.191739854 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container) Feb 20 03:11:37 localhost podman[74585]: 2026-02-20 08:11:37.262496964 +0000 UTC m=+0.197318366 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, release=1766032510, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible) Feb 20 03:11:37 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:11:37 localhost podman[74587]: 2026-02-20 08:11:37.28316029 +0000 UTC m=+0.210950165 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:11:37 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:11:37 localhost podman[74586]: 2026-02-20 08:11:37.341153149 +0000 UTC m=+0.271178033 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron) Feb 20 03:11:37 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:11:38 localhost systemd[1]: tmp-crun.VHnWWz.mount: Deactivated successfully. Feb 20 03:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:11:39 localhost podman[74656]: 2026-02-20 08:11:39.135553103 +0000 UTC m=+0.074944162 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5) Feb 20 03:11:39 localhost sshd[74679]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:39 localhost podman[74656]: 2026-02-20 08:11:39.511349362 +0000 UTC m=+0.450740421 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:11:39 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:11:39 localhost sshd[74681]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:11:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:11:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:11:41 localhost podman[74685]: 2026-02-20 08:11:41.133669279 +0000 UTC m=+0.073455166 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible) Feb 20 03:11:41 localhost systemd[1]: tmp-crun.oe8bfj.mount: Deactivated successfully. 
Feb 20 03:11:41 localhost podman[74685]: 2026-02-20 08:11:41.183984231 +0000 UTC m=+0.123770078 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 20 03:11:41 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:11:41 localhost podman[74684]: 2026-02-20 08:11:41.234735876 +0000 UTC m=+0.171678155 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64) Feb 20 03:11:41 localhost podman[74683]: 2026-02-20 08:11:41.18623322 +0000 UTC m=+0.128256186 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:11:41 localhost podman[74684]: 2026-02-20 08:11:41.243951559 +0000 UTC m=+0.180893868 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_id=tripleo_step3, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1) Feb 20 03:11:41 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:11:41 localhost podman[74683]: 2026-02-20 08:11:41.264296998 +0000 UTC m=+0.206319964 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:11:41 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:11:45 localhost sshd[74743]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:49 localhost sshd[74745]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:54 localhost sshd[74747]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:58 localhost sshd[74749]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:58 localhost sshd[74751]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:11:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:11:59 localhost systemd[1]: tmp-crun.kwtMEV.mount: Deactivated successfully. 
Feb 20 03:11:59 localhost podman[74752]: 2026-02-20 08:11:59.15370739 +0000 UTC m=+0.092399109 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.openshift.expose-services=) Feb 20 03:11:59 localhost podman[74752]: 2026-02-20 08:11:59.373131115 +0000 UTC m=+0.311822804 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, batch=17.1_20260112.1) Feb 20 03:11:59 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:12:02 localhost sshd[74782]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:12:07 localhost podman[74784]: 2026-02-20 08:12:07.157343387 +0000 UTC m=+0.090371208 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:10:15Z, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:12:07 localhost podman[74784]: 2026-02-20 08:12:07.17315224 +0000 UTC m=+0.106179971 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:12:07 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:12:07 localhost podman[74805]: 2026-02-20 08:12:07.978539961 +0000 UTC m=+0.082179757 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:12:07 localhost podman[74805]: 2026-02-20 08:12:07.988235267 +0000 UTC m=+0.091875073 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:12:08 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:12:08 localhost podman[74804]: 2026-02-20 08:12:08.036416432 +0000 UTC m=+0.142176122 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:12:08 localhost podman[74804]: 2026-02-20 08:12:08.090096805 +0000 UTC m=+0.195856475 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T23:07:30Z, release=1766032510) Feb 20 03:12:08 localhost podman[74806]: 2026-02-20 08:12:08.099307847 +0000 UTC m=+0.193224205 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:12:08 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:12:08 localhost podman[74806]: 2026-02-20 08:12:08.151202015 +0000 UTC m=+0.245118363 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, 
container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:12:08 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:12:09 localhost podman[74976]: 2026-02-20 08:12:09.139333009 +0000 UTC m=+0.086484988 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True) Feb 20 03:12:09 localhost podman[74976]: 2026-02-20 08:12:09.281354866 +0000 UTC m=+0.228506885 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, version=7, vcs-type=git, architecture=x86_64, ceph=True, GIT_CLEAN=True, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, name=rhceph, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7) Feb 20 03:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:12:09 localhost podman[75056]: 2026-02-20 08:12:09.772356474 +0000 UTC m=+0.092525883 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team) Feb 20 03:12:10 localhost podman[75056]: 2026-02-20 08:12:10.146123433 +0000 UTC m=+0.466292812 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:12:10 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:12:10 localhost sshd[75125]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:12:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:12:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:12:12 localhost systemd[1]: tmp-crun.Hh4NOE.mount: Deactivated successfully. Feb 20 03:12:12 localhost podman[75142]: 2026-02-20 08:12:12.150708098 +0000 UTC m=+0.091343707 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:12:12 localhost podman[75142]: 2026-02-20 08:12:12.181997695 +0000 UTC m=+0.122633384 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=) Feb 20 03:12:12 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:12:12 localhost podman[75143]: 2026-02-20 08:12:12.202212194 +0000 UTC m=+0.143238525 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public) Feb 20 03:12:12 localhost podman[75144]: 2026-02-20 08:12:12.248518942 +0000 UTC m=+0.185160698 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Feb 20 03:12:12 localhost podman[75143]: 2026-02-20 08:12:12.253061251 +0000 UTC m=+0.194087572 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:12:12 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:12:12 localhost podman[75144]: 2026-02-20 08:12:12.293950542 +0000 UTC m=+0.230592208 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, release=1766032510, vcs-type=git, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.expose-services=, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 20 03:12:12 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:12:15 localhost sshd[75208]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:19 localhost sshd[75210]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:24 localhost sshd[75212]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:25 localhost sshd[75214]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:27 localhost sshd[75216]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:12:30 localhost podman[75218]: 2026-02-20 08:12:30.120358624 +0000 UTC m=+0.068507028 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, distribution-scope=public, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr) Feb 20 03:12:30 localhost podman[75218]: 2026-02-20 08:12:30.349458556 +0000 UTC m=+0.297607020 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team) Feb 20 03:12:30 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:12:32 localhost sshd[75249]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:34 localhost sshd[75251]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:12:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:12:38 localhost podman[75253]: 2026-02-20 08:12:38.139021411 +0000 UTC m=+0.080779134 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, 
name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:12:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:12:38 localhost podman[75253]: 2026-02-20 08:12:38.158139196 +0000 UTC m=+0.099896959 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, tcib_managed=true, architecture=x86_64, release=1766032510) Feb 20 03:12:38 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:12:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. 
Feb 20 03:12:38 localhost podman[75292]: 2026-02-20 08:12:38.292738116 +0000 UTC m=+0.092909075 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:12:38 localhost podman[75254]: 2026-02-20 08:12:38.250899195 +0000 UTC m=+0.188094618 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:12:38 localhost podman[75282]: 2026-02-20 08:12:38.311292883 +0000 UTC m=+0.146443453 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, architecture=x86_64, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:12:38 localhost podman[75292]: 2026-02-20 08:12:38.322038662 +0000 UTC m=+0.122209611 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:12:38 localhost podman[75254]: 2026-02-20 08:12:38.331030257 +0000 UTC m=+0.268225710 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, release=1766032510, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, 
io.buildah.version=1.41.5, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com) Feb 20 03:12:38 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:12:38 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:12:38 localhost podman[75282]: 2026-02-20 08:12:38.373174607 +0000 UTC m=+0.208325127 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1) Feb 20 03:12:38 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:12:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:12:41 localhost podman[75345]: 2026-02-20 08:12:41.15172971 +0000 UTC m=+0.088859240 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, 
maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4) Feb 20 03:12:41 localhost podman[75345]: 2026-02-20 08:12:41.548996109 +0000 UTC m=+0.486125619 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Feb 20 03:12:41 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:12:41 localhost sshd[75368]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:12:43 localhost podman[75370]: 2026-02-20 08:12:43.139084767 +0000 UTC m=+0.076272746 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.13, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc.) Feb 20 03:12:43 localhost podman[75370]: 2026-02-20 08:12:43.166193746 +0000 UTC m=+0.103381715 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 20 03:12:43 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:12:43 localhost podman[75372]: 2026-02-20 08:12:43.258819121 +0000 UTC m=+0.185603962 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Feb 20 03:12:43 
localhost systemd[1]: tmp-crun.CA4bBr.mount: Deactivated successfully. Feb 20 03:12:43 localhost podman[75371]: 2026-02-20 08:12:43.310239935 +0000 UTC m=+0.241180793 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:12:43 localhost podman[75371]: 2026-02-20 08:12:43.320821299 +0000 UTC m=+0.251762107 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, 
managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:12:43 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:12:43 localhost podman[75372]: 2026-02-20 08:12:43.361622967 +0000 UTC m=+0.288407808 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:12:43 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:12:49 localhost sshd[75436]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:12:56 localhost sshd[75438]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:13:01 localhost podman[75440]: 2026-02-20 08:13:01.154353108 +0000 UTC m=+0.087194481 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, 
url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, version=17.1.13, config_id=tripleo_step1, distribution-scope=public) Feb 20 03:13:01 localhost sshd[75470]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:01 localhost podman[75440]: 2026-02-20 08:13:01.316246892 +0000 UTC m=+0.249088265 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:13:01 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:13:05 localhost sshd[75472]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:13:09 localhost systemd[1]: tmp-crun.EWKxml.mount: Deactivated successfully. Feb 20 03:13:09 localhost podman[75474]: 2026-02-20 08:13:09.127552973 +0000 UTC m=+0.068829438 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 20 03:13:09 localhost podman[75476]: 2026-02-20 08:13:09.142873192 +0000 UTC m=+0.075324447 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, name=rhosp-rhel9/openstack-collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 20 03:13:09 localhost podman[75474]: 2026-02-20 
08:13:09.148930857 +0000 UTC m=+0.090207322 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:13:09 localhost podman[75482]: 2026-02-20 08:13:09.190751307 +0000 UTC m=+0.121026325 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc.) 
Feb 20 03:13:09 localhost podman[75476]: 2026-02-20 08:13:09.200067473 +0000 UTC m=+0.132518748 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_id=tripleo_step3, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:13:09 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:13:09 localhost podman[75482]: 2026-02-20 08:13:09.218010632 +0000 UTC m=+0.148285660 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 20 03:13:09 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:13:09 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:13:09 localhost podman[75475]: 2026-02-20 08:13:09.171275661 +0000 UTC m=+0.109072069 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=logrotate_crond, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) Feb 20 03:13:09 localhost podman[75475]: 2026-02-20 08:13:09.304990624 +0000 UTC m=+0.242787042 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.13, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=) Feb 20 03:13:09 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:13:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:13:12 localhost podman[75627]: 2026-02-20 08:13:12.144581775 +0000 UTC m=+0.082170456 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., 
vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, release=1766032510) Feb 20 03:13:12 localhost sshd[75647]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:12 localhost sshd[75649]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:12 localhost podman[75627]: 2026-02-20 08:13:12.535451668 +0000 UTC m=+0.473040309 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team) Feb 20 03:13:12 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:13:14 localhost systemd[1]: tmp-crun.X29CHf.mount: Deactivated successfully. Feb 20 03:13:14 localhost podman[75668]: 2026-02-20 08:13:14.15245425 +0000 UTC m=+0.096865276 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:13:14 localhost systemd[1]: tmp-crun.4eQ0EK.mount: Deactivated successfully. Feb 20 03:13:14 localhost podman[75669]: 2026-02-20 08:13:14.18805948 +0000 UTC m=+0.132860547 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid) Feb 20 03:13:14 localhost podman[75668]: 2026-02-20 08:13:14.199120848 +0000 UTC m=+0.143531854 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public) Feb 20 03:13:14 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:13:14 localhost podman[75670]: 2026-02-20 08:13:14.245918991 +0000 UTC m=+0.184194539 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:13:14 localhost podman[75669]: 2026-02-20 08:13:14.271918037 +0000 UTC m=+0.216719134 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:13:14 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:13:14 localhost podman[75670]: 2026-02-20 08:13:14.31614356 +0000 UTC m=+0.254419088 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, release=1766032510) Feb 20 03:13:14 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:13:15 localhost sshd[75732]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:16 localhost python3[75781]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:13:16 localhost python3[75826]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575196.2527015-114047-222902698684906/source _original_basename=tmptlhqj4sy follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:17 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:13:17 localhost recover_tripleo_nova_virtqemud[75858]: 63005 Feb 20 03:13:17 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:13:17 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:13:17 localhost python3[75856]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:13:19 localhost ansible-async_wrapper.py[76030]: Invoked with 129177955356 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575199.0307937-114221-39317160651077/AnsiballZ_command.py _ Feb 20 03:13:19 localhost ansible-async_wrapper.py[76033]: Starting module and watcher Feb 20 03:13:19 localhost ansible-async_wrapper.py[76033]: Start watching 76034 (3600) Feb 20 03:13:19 localhost ansible-async_wrapper.py[76034]: Start module (76034) Feb 20 03:13:19 localhost ansible-async_wrapper.py[76030]: Return async_wrapper task started. Feb 20 03:13:19 localhost python3[76054]: ansible-ansible.legacy.async_status Invoked with jid=129177955356.76030 mode=status _async_dir=/tmp/.ansible_async Feb 20 03:13:21 localhost sshd[76074]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:23 localhost puppet-user[76051]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 20 03:13:23 localhost puppet-user[76051]: (file: /etc/puppet/hiera.yaml) Feb 20 03:13:23 localhost puppet-user[76051]: Warning: Undefined variable '::deploy_config_name'; Feb 20 03:13:23 localhost puppet-user[76051]: (file & line not available) Feb 20 03:13:23 localhost puppet-user[76051]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 20 03:13:23 localhost puppet-user[76051]: (file & line not available) Feb 20 03:13:23 localhost puppet-user[76051]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Feb 20 03:13:23 localhost puppet-user[76051]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:13:23 localhost puppet-user[76051]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:13:23 localhost puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:13:23 localhost puppet-user[76051]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:13:23 localhost puppet-user[76051]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:13:23 localhost puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:13:23 localhost puppet-user[76051]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:13:23 localhost puppet-user[76051]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:13:23 localhost puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:13:23 localhost puppet-user[76051]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. 
at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:13:23 localhost puppet-user[76051]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:13:23 localhost puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:13:23 localhost puppet-user[76051]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:13:23 localhost puppet-user[76051]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:13:23 localhost puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Feb 20 03:13:23 localhost puppet-user[76051]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Feb 20 03:13:23 localhost puppet-user[76051]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Feb 20 03:13:23 localhost puppet-user[76051]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 20 03:13:23 localhost puppet-user[76051]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.26 seconds Feb 20 03:13:24 localhost puppet-user[76051]: Notice: Applied catalog in 0.28 seconds Feb 20 03:13:24 localhost puppet-user[76051]: Application: Feb 20 03:13:24 localhost puppet-user[76051]: Initial environment: production Feb 20 03:13:24 localhost puppet-user[76051]: Converged environment: production Feb 20 03:13:24 localhost puppet-user[76051]: Run mode: user Feb 20 03:13:24 localhost puppet-user[76051]: Changes: Feb 20 03:13:24 localhost puppet-user[76051]: Events: Feb 20 03:13:24 localhost puppet-user[76051]: Resources: Feb 20 03:13:24 localhost puppet-user[76051]: Total: 19 Feb 20 03:13:24 localhost puppet-user[76051]: Time: Feb 20 03:13:24 localhost puppet-user[76051]: Schedule: 0.00 Feb 20 03:13:24 localhost puppet-user[76051]: Package: 0.00 Feb 20 03:13:24 localhost puppet-user[76051]: Exec: 0.01 Feb 20 03:13:24 localhost puppet-user[76051]: Augeas: 0.01 Feb 20 03:13:24 localhost puppet-user[76051]: File: 0.03 Feb 20 03:13:24 localhost puppet-user[76051]: Service: 0.08 Feb 20 03:13:24 localhost puppet-user[76051]: Transaction evaluation: 0.27 Feb 20 03:13:24 localhost puppet-user[76051]: Catalog application: 0.28 Feb 20 03:13:24 localhost puppet-user[76051]: Config retrieval: 0.33 Feb 20 03:13:24 localhost puppet-user[76051]: Last run: 1771575204 Feb 20 03:13:24 localhost puppet-user[76051]: Filebucket: 0.00 Feb 20 03:13:24 localhost puppet-user[76051]: Total: 0.28 Feb 20 03:13:24 localhost puppet-user[76051]: Version: Feb 20 03:13:24 localhost puppet-user[76051]: Config: 1771575203 Feb 20 03:13:24 localhost puppet-user[76051]: Puppet: 7.10.0 Feb 20 03:13:24 localhost ansible-async_wrapper.py[76034]: Module complete (76034) Feb 20 03:13:24 localhost ansible-async_wrapper.py[76033]: Done in kid B. 
Feb 20 03:13:26 localhost sshd[76179]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:28 localhost sshd[76181]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:30 localhost python3[76198]: ansible-ansible.legacy.async_status Invoked with jid=129177955356.76030 mode=status _async_dir=/tmp/.ansible_async Feb 20 03:13:30 localhost sshd[76215]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:30 localhost python3[76214]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 03:13:31 localhost python3[76232]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:13:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:13:31 localhost podman[76283]: 2026-02-20 08:13:31.686492894 +0000 UTC m=+0.091545062 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public) Feb 20 03:13:31 localhost python3[76282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:13:31 localhost podman[76283]: 2026-02-20 08:13:31.903584869 +0000 UTC m=+0.308637007 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team) Feb 20 03:13:31 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:13:32 localhost python3[76328]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpgolb4ma6 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 20 03:13:32 localhost python3[76358]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:32 localhost sshd[76359]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:34 localhost python3[76465]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 20 03:13:35 localhost python3[76484]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:36 localhost sshd[76517]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:36 localhost python3[76516]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:13:36 localhost python3[76568]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:13:36 localhost python3[76586]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:37 localhost python3[76648]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:13:37 localhost python3[76666]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:38 localhost python3[76728]: ansible-ansible.legacy.stat 
Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:13:38 localhost python3[76746]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:38 localhost sshd[76807]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:39 localhost python3[76809]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:13:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:13:39 localhost podman[76828]: 2026-02-20 08:13:39.317242167 +0000 UTC m=+0.072386305 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, vcs-type=git, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.) Feb 20 03:13:39 localhost podman[76828]: 2026-02-20 08:13:39.326568154 +0000 UTC m=+0.081712312 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13) Feb 20 03:13:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:13:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:13:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:13:39 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:13:39 localhost python3[76829]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:39 localhost systemd[1]: tmp-crun.yZ36nV.mount: Deactivated successfully. 
Feb 20 03:13:39 localhost podman[76856]: 2026-02-20 08:13:39.424296485 +0000 UTC m=+0.077645088 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510) Feb 20 03:13:39 localhost podman[76849]: 2026-02-20 08:13:39.467845507 +0000 UTC m=+0.130147594 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, architecture=x86_64, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:13:39 localhost podman[76856]: 2026-02-20 08:13:39.488967464 +0000 UTC m=+0.142316017 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, container_name=logrotate_crond, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 20 03:13:39 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:13:39 localhost podman[76849]: 2026-02-20 08:13:39.522183381 +0000 UTC m=+0.184485498 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack 
TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, vcs-type=git) Feb 20 03:13:39 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:13:39 localhost podman[76850]: 2026-02-20 08:13:39.546537366 +0000 UTC m=+0.199448966 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, architecture=x86_64, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Feb 20 03:13:39 localhost podman[76850]: 2026-02-20 08:13:39.578078161 +0000 UTC m=+0.230989751 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 20 03:13:39 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:13:39 localhost python3[76951]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:13:39 localhost systemd[1]: Reloading. Feb 20 03:13:40 localhost systemd-rc-local-generator[76972]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:13:40 localhost systemd-sysv-generator[76977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:13:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:13:40 localhost systemd[1]: tmp-crun.acIsiJ.mount: Deactivated successfully. 
Feb 20 03:13:40 localhost python3[77037]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:13:41 localhost python3[77055]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:41 localhost python3[77117]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 20 03:13:41 localhost sshd[77136]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:41 localhost python3[77135]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:13:42 localhost python3[77166]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:13:42 localhost systemd[1]: Reloading. 
Feb 20 03:13:42 localhost systemd-rc-local-generator[77188]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:13:42 localhost systemd-sysv-generator[77194]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:13:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:13:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:13:42 localhost systemd[1]: Starting Create netns directory... Feb 20 03:13:42 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 20 03:13:42 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 20 03:13:42 localhost systemd[1]: Finished Create netns directory. 
Feb 20 03:13:42 localhost podman[77205]: 2026-02-20 08:13:42.801568862 +0000 UTC m=+0.077774821 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, tcib_managed=true, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, distribution-scope=public) Feb 20 03:13:43 localhost podman[77205]: 2026-02-20 08:13:43.188446683 +0000 UTC m=+0.464652722 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:13:43 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:13:43 localhost python3[77248]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 20 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:13:45 localhost podman[77291]: 2026-02-20 08:13:45.149848715 +0000 UTC m=+0.085906869 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1) Feb 20 03:13:45 localhost podman[77292]: 2026-02-20 08:13:45.209919314 +0000 UTC m=+0.143482192 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:13:45 localhost podman[77292]: 2026-02-20 08:13:45.218095725 +0000 UTC m=+0.151658603 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:13:45 localhost podman[77291]: 2026-02-20 08:13:45.227059039 +0000 UTC m=+0.163117203 container exec_died 
0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller) Feb 20 03:13:45 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:13:45 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:13:45 localhost podman[77293]: 2026-02-20 08:13:45.308903524 +0000 UTC m=+0.239784470 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:13:45 localhost podman[77293]: 2026-02-20 08:13:45.359077619 +0000 UTC m=+0.289958605 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:13:45 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:13:45 localhost python3[77355]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 20 03:13:45 localhost sshd[77398]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:45 localhost podman[77410]: 2026-02-20 08:13:45.708846465 +0000 UTC m=+0.089003125 container create a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:13:45 localhost systemd[1]: Started libpod-conmon-a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.scope. Feb 20 03:13:45 localhost podman[77410]: 2026-02-20 08:13:45.664789106 +0000 UTC m=+0.044945796 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 03:13:45 localhost systemd[1]: Started libcrun container. Feb 20 03:13:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 20 03:13:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:13:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:13:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 03:13:45 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:13:45 localhost podman[77410]: 2026-02-20 08:13:45.820200373 +0000 UTC m=+0.200357063 container init a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:13:45 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. 
Feb 20 03:13:45 localhost podman[77410]: 2026-02-20 08:13:45.868554903 +0000 UTC m=+0.248711553 container start a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:13:45 localhost python3[77355]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 03:13:45 localhost systemd[1]: Created slice User Slice of UID 0. Feb 20 03:13:45 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 20 03:13:45 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 20 03:13:45 localhost systemd[1]: Starting User Manager for UID 0... 
Feb 20 03:13:45 localhost podman[77432]: 2026-02-20 08:13:45.96223974 +0000 UTC m=+0.087887651 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:13:46 localhost podman[77432]: 2026-02-20 08:13:46.010452406 +0000 UTC m=+0.136100317 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, release=1766032510, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:13:46 localhost podman[77432]: unhealthy Feb 20 03:13:46 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:13:46 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. Feb 20 03:13:46 localhost systemd[77446]: Queued start job for default target Main User Target. Feb 20 03:13:46 localhost systemd[77446]: Created slice User Application Slice. Feb 20 03:13:46 localhost systemd[77446]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 20 03:13:46 localhost systemd[77446]: Started Daily Cleanup of User's Temporary Directories. Feb 20 03:13:46 localhost systemd[77446]: Reached target Paths. Feb 20 03:13:46 localhost systemd[77446]: Reached target Timers. Feb 20 03:13:46 localhost systemd[77446]: Starting D-Bus User Message Bus Socket... Feb 20 03:13:46 localhost systemd[77446]: Starting Create User's Volatile Files and Directories... Feb 20 03:13:46 localhost systemd[77446]: Listening on D-Bus User Message Bus Socket. Feb 20 03:13:46 localhost systemd[77446]: Reached target Sockets. Feb 20 03:13:46 localhost systemd[77446]: Finished Create User's Volatile Files and Directories. 
Feb 20 03:13:46 localhost systemd[77446]: Reached target Basic System. Feb 20 03:13:46 localhost systemd[77446]: Reached target Main User Target. Feb 20 03:13:46 localhost systemd[77446]: Startup finished in 139ms. Feb 20 03:13:46 localhost systemd[1]: Started User Manager for UID 0. Feb 20 03:13:46 localhost systemd[1]: Started Session c10 of User root. Feb 20 03:13:46 localhost systemd[1]: session-c10.scope: Deactivated successfully. Feb 20 03:13:46 localhost podman[77535]: 2026-02-20 08:13:46.37880894 +0000 UTC m=+0.085534498 container create 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible) Feb 20 03:13:46 localhost systemd[1]: Started libpod-conmon-9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2.scope. Feb 20 03:13:46 localhost podman[77535]: 2026-02-20 08:13:46.336084703 +0000 UTC m=+0.042810311 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 03:13:46 localhost systemd[1]: Started libcrun container. 
Feb 20 03:13:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bcdbf69658b435b3643ef361fbfcbd57ebf5cb53d4f9a18cec2f56d5690ff17c/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 20 03:13:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bcdbf69658b435b3643ef361fbfcbd57ebf5cb53d4f9a18cec2f56d5690ff17c/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Feb 20 03:13:46 localhost podman[77535]: 2026-02-20 08:13:46.48631994 +0000 UTC m=+0.193045478 container init 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, distribution-scope=public, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible) Feb 20 03:13:46 localhost podman[77535]: 2026-02-20 08:13:46.498442852 +0000 UTC m=+0.205168380 container start 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Feb 20 03:13:46 localhost podman[77535]: 2026-02-20 08:13:46.498991469 +0000 UTC m=+0.205717047 container attach 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5) Feb 20 03:13:48 localhost sshd[77559]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:50 localhost sshd[77560]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:54 localhost sshd[77562]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:56 localhost systemd[1]: Stopping User Manager for UID 0... Feb 20 03:13:56 localhost systemd[77446]: Activating special unit Exit the Session... Feb 20 03:13:56 localhost systemd[77446]: Stopped target Main User Target. Feb 20 03:13:56 localhost systemd[77446]: Stopped target Basic System. Feb 20 03:13:56 localhost systemd[77446]: Stopped target Paths. Feb 20 03:13:56 localhost systemd[77446]: Stopped target Sockets. Feb 20 03:13:56 localhost systemd[77446]: Stopped target Timers. Feb 20 03:13:56 localhost systemd[77446]: Stopped Daily Cleanup of User's Temporary Directories. Feb 20 03:13:56 localhost systemd[77446]: Closed D-Bus User Message Bus Socket. Feb 20 03:13:56 localhost systemd[77446]: Stopped Create User's Volatile Files and Directories. Feb 20 03:13:56 localhost systemd[77446]: Removed slice User Application Slice. Feb 20 03:13:56 localhost systemd[77446]: Reached target Shutdown. Feb 20 03:13:56 localhost systemd[77446]: Finished Exit the Session. Feb 20 03:13:56 localhost systemd[77446]: Reached target Exit the Session. 
Feb 20 03:13:56 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 20 03:13:56 localhost systemd[1]: Stopped User Manager for UID 0. Feb 20 03:13:56 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 20 03:13:56 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 20 03:13:56 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 20 03:13:56 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 20 03:13:56 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 20 03:13:58 localhost sshd[77565]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:13:59 localhost sshd[77567]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:14:02 localhost podman[77569]: 2026-02-20 08:14:02.16243874 +0000 UTC m=+0.097870806 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64) Feb 20 03:14:02 localhost podman[77569]: 2026-02-20 08:14:02.367561299 +0000 UTC m=+0.302993325 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64) Feb 20 03:14:02 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:14:02 localhost sshd[77598]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:06 localhost sshd[77600]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. 
Feb 20 03:14:10 localhost podman[77602]: 2026-02-20 08:14:10.146969004 +0000 UTC m=+0.087020355 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:14:10 localhost podman[77603]: 2026-02-20 08:14:10.200104179 +0000 UTC m=+0.137041224 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=logrotate_crond, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat 
OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 20 03:14:10 localhost podman[77603]: 2026-02-20 08:14:10.236147223 +0000 UTC m=+0.173084278 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, release=1766032510, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true) Feb 20 03:14:10 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:14:10 localhost podman[77604]: 2026-02-20 08:14:10.256053622 +0000 UTC m=+0.190985666 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:14:10 localhost podman[77604]: 2026-02-20 08:14:10.269962348 +0000 UTC m=+0.204894392 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:14:10 localhost podman[77605]: 2026-02-20 08:14:10.312848951 +0000 UTC m=+0.242412091 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64) Feb 20 03:14:10 localhost podman[77602]: 2026-02-20 08:14:10.330118729 +0000 UTC m=+0.270170070 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git) Feb 20 03:14:10 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:14:10 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:14:10 localhost podman[77605]: 2026-02-20 08:14:10.372204658 +0000 UTC m=+0.301767788 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, 
tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git) Feb 20 03:14:10 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:14:10 localhost sshd[77691]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:11 localhost systemd[1]: tmp-crun.egEuxt.mount: Deactivated successfully. Feb 20 03:14:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:14:14 localhost systemd[1]: tmp-crun.CUanUo.mount: Deactivated successfully. 
Feb 20 03:14:14 localhost podman[77754]: 2026-02-20 08:14:14.186628325 +0000 UTC m=+0.093408551 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:14:14 localhost podman[77754]: 2026-02-20 08:14:14.59425018 +0000 UTC m=+0.501030396 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc.) Feb 20 03:14:14 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:14:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:14:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:14:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:14:15 localhost systemd[1]: tmp-crun.A4LGtX.mount: Deactivated successfully. Feb 20 03:14:15 localhost podman[77794]: 2026-02-20 08:14:15.481091294 +0000 UTC m=+0.090960476 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc.) 
Feb 20 03:14:15 localhost podman[77792]: 2026-02-20 08:14:15.444806803 +0000 UTC m=+0.066963660 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, 
build-date=2026-01-12T22:36:40Z, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 20 03:14:15 localhost podman[77793]: 2026-02-20 08:14:15.516436245 +0000 UTC m=+0.135975212 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack 
Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=) Feb 20 03:14:15 localhost podman[77793]: 2026-02-20 08:14:15.531999792 +0000 UTC m=+0.151538779 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com) Feb 20 03:14:15 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:14:15 localhost sshd[77856]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:15 localhost podman[77792]: 2026-02-20 08:14:15.582407575 +0000 UTC m=+0.204564472 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team) Feb 20 03:14:15 localhost podman[77794]: 2026-02-20 08:14:15.619268083 +0000 UTC m=+0.229137275 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:14:15 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:14:15 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:14:16 localhost podman[77857]: 2026-02-20 08:14:16.132187521 +0000 UTC m=+0.066953450 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.buildah.version=1.41.5, container_name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1) Feb 20 03:14:16 localhost podman[77857]: 2026-02-20 08:14:16.192170447 +0000 UTC m=+0.126936426 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step5, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:14:16 localhost podman[77857]: unhealthy Feb 20 03:14:16 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:14:16 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. Feb 20 03:14:18 localhost sshd[77880]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:22 localhost sshd[77882]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:27 localhost sshd[77884]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:31 localhost sshd[77886]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:14:33 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:14:33 localhost recover_tripleo_nova_virtqemud[77895]: 63005 Feb 20 03:14:33 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:14:33 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:14:33 localhost podman[77888]: 2026-02-20 08:14:33.163429035 +0000 UTC m=+0.099967811 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z) Feb 20 03:14:33 localhost podman[77888]: 2026-02-20 08:14:33.356936468 +0000 UTC m=+0.293475194 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step1) Feb 20 03:14:33 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:14:39 localhost sshd[77920]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:14:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:14:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:14:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:14:41 localhost systemd[1]: tmp-crun.Vl5qvP.mount: Deactivated successfully. Feb 20 03:14:41 localhost podman[77922]: 2026-02-20 08:14:41.109596254 +0000 UTC m=+0.093350658 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Feb 20 03:14:41 localhost systemd[1]: tmp-crun.pOAWx1.mount: Deactivated successfully. Feb 20 03:14:41 localhost podman[77926]: 2026-02-20 08:14:41.167600139 +0000 UTC m=+0.142150322 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, 
build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible) Feb 20 03:14:41 localhost podman[77926]: 2026-02-20 08:14:41.197923608 +0000 UTC m=+0.172473771 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack 
TripleO Team, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 20 03:14:41 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:14:41 localhost podman[77924]: 2026-02-20 08:14:41.211779262 +0000 UTC m=+0.191701219 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, container_name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container) Feb 20 03:14:41 localhost podman[77922]: 2026-02-20 08:14:41.220383075 +0000 UTC m=+0.204137439 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, 
distribution-scope=public, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible) Feb 20 03:14:41 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:14:41 localhost podman[77924]: 2026-02-20 08:14:41.275195112 +0000 UTC m=+0.255117079 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:14:41 localhost podman[77923]: 2026-02-20 08:14:41.310902565 +0000 UTC m=+0.290892574 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Feb 20 03:14:41 localhost podman[77923]: 2026-02-20 08:14:41.325153291 +0000 UTC m=+0.305143300 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e 
(image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com) Feb 20 03:14:41 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:14:41 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:14:42 localhost systemd[1]: tmp-crun.ar2SZw.mount: Deactivated successfully. Feb 20 03:14:42 localhost sshd[78013]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:43 localhost sshd[78015]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:14:44 localhost podman[78017]: 2026-02-20 08:14:44.894302152 +0000 UTC m=+0.079726400 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1766032510, 
build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, container_name=nova_migration_target) Feb 20 03:14:45 localhost sshd[78041]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:45 localhost podman[78017]: 2026-02-20 08:14:45.271298901 +0000 UTC m=+0.456723109 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:14:45 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:14:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:14:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:14:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:14:46 localhost podman[78042]: 2026-02-20 08:14:46.142142636 +0000 UTC m=+0.080668651 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 20 03:14:46 localhost podman[78042]: 2026-02-20 08:14:46.171113042 +0000 UTC m=+0.109639087 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13) Feb 20 03:14:46 localhost podman[78043]: 2026-02-20 08:14:46.196151188 +0000 UTC m=+0.131772144 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:14:46 localhost podman[78043]: 2026-02-20 08:14:46.235945766 +0000 UTC m=+0.171566672 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, architecture=x86_64, vcs-type=git, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 
iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5) Feb 20 03:14:46 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:14:46 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:14:46 localhost podman[78044]: 2026-02-20 08:14:46.248916313 +0000 UTC m=+0.181463125 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 20 03:14:46 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:14:46 localhost podman[78044]: 2026-02-20 08:14:46.29259616 +0000 UTC m=+0.225142952 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4) Feb 20 03:14:46 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:14:46 localhost podman[78104]: 2026-02-20 08:14:46.340226848 +0000 UTC m=+0.079764602 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 20 03:14:46 localhost podman[78104]: 2026-02-20 08:14:46.417623557 +0000 UTC m=+0.157161261 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:14:46 localhost podman[78104]: unhealthy Feb 20 03:14:46 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:14:46 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. Feb 20 03:14:46 localhost sshd[78133]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:49 localhost sshd[78135]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:52 localhost sshd[78137]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:14:55 localhost systemd[1]: session-28.scope: Deactivated successfully. Feb 20 03:14:55 localhost systemd[1]: session-28.scope: Consumed 3.011s CPU time. Feb 20 03:14:55 localhost systemd-logind[759]: Session 28 logged out. Waiting for processes to exit. Feb 20 03:14:55 localhost systemd-logind[759]: Removed session 28. 
Feb 20 03:14:56 localhost sshd[78139]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:00 localhost sshd[78141]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:03 localhost sshd[78143]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:15:04 localhost systemd[1]: tmp-crun.RdiBXl.mount: Deactivated successfully. Feb 20 03:15:04 localhost podman[78144]: 2026-02-20 08:15:04.152202547 +0000 UTC m=+0.087957232 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.buildah.version=1.41.5, managed_by=tripleo_ansible) Feb 20 03:15:04 localhost podman[78144]: 2026-02-20 08:15:04.386312153 +0000 UTC m=+0.322066848 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, batch=17.1_20260112.1) Feb 20 03:15:04 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:15:08 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:15:08 localhost recover_tripleo_nova_virtqemud[78173]: 63005 Feb 20 03:15:08 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:15:08 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:15:10 localhost sshd[78174]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:15:11 localhost podman[78176]: 2026-02-20 08:15:11.401120615 +0000 UTC m=+0.085422554 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 20 03:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:15:11 localhost podman[78177]: 2026-02-20 08:15:11.455015515 +0000 UTC m=+0.134567580 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, release=1766032510, distribution-scope=public, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:15:11 localhost podman[78176]: 2026-02-20 08:15:11.467510048 +0000 UTC m=+0.151811957 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true) Feb 20 03:15:11 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:15:11 localhost podman[78177]: 2026-02-20 08:15:11.484040864 +0000 UTC m=+0.163592989 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., 
build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:15:11 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:15:11 localhost podman[78208]: 2026-02-20 08:15:11.543207364 +0000 UTC m=+0.122876792 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=logrotate_crond, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:15:11 localhost podman[78208]: 2026-02-20 08:15:11.550763126 +0000 UTC m=+0.130432544 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, release=1766032510, tcib_managed=true, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:15:11 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:15:11 localhost podman[78209]: 2026-02-20 08:15:11.610587777 +0000 UTC m=+0.185984244 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:15:11 localhost podman[78209]: 2026-02-20 08:15:11.626040819 +0000 UTC m=+0.201437346 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, version=17.1.13, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5) Feb 20 03:15:11 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:15:11 localhost sshd[78269]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:13 localhost sshd[78271]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:13 localhost sshd[78273]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:15:15 localhost podman[78352]: 2026-02-20 08:15:15.963034382 +0000 UTC m=+0.093032119 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_migration_target, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, release=1766032510, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public) Feb 20 03:15:16 localhost podman[78352]: 2026-02-20 08:15:16.312140456 +0000 UTC m=+0.442138163 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, version=17.1.13) Feb 20 03:15:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:15:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:15:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:15:16 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:15:16 localhost systemd[1]: tmp-crun.z9niV7.mount: Deactivated successfully. Feb 20 03:15:16 localhost podman[78375]: 2026-02-20 08:15:16.408611779 +0000 UTC m=+0.062407361 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.buildah.version=1.41.5) Feb 20 03:15:16 localhost podman[78377]: 2026-02-20 08:15:16.426403124 +0000 UTC m=+0.072242183 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
managed_by=tripleo_ansible, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, release=1766032510, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Feb 20 03:15:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:15:16 localhost podman[78375]: 2026-02-20 08:15:16.461051074 +0000 UTC m=+0.114846656 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-type=git, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:15:16 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:15:16 localhost podman[78376]: 2026-02-20 08:15:16.462071006 +0000 UTC m=+0.112046311 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step3, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, distribution-scope=public, container_name=iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, release=1766032510) Feb 20 03:15:16 localhost podman[78433]: 2026-02-20 08:15:16.526369114 +0000 UTC m=+0.063459884 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=nova_compute, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step5, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:15:16 localhost podman[78376]: 2026-02-20 08:15:16.543306082 +0000 UTC m=+0.193281407 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, release=1766032510, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13) Feb 20 03:15:16 localhost podman[78377]: 2026-02-20 08:15:16.550704148 +0000 UTC m=+0.196543257 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
url=https://www.redhat.com, distribution-scope=public, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Feb 20 03:15:16 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:15:16 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:15:16 localhost podman[78433]: 2026-02-20 08:15:16.608889029 +0000 UTC m=+0.145979809 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 20 03:15:16 localhost podman[78433]: unhealthy Feb 20 03:15:16 localhost systemd[1]: 
a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:15:16 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. Feb 20 03:15:18 localhost sshd[78464]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:23 localhost sshd[78466]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:27 localhost sshd[78468]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:30 localhost sshd[78470]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:31 localhost sshd[78472]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:32 localhost sshd[78474]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:33 localhost sshd[78476]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:34 localhost sshd[78478]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:15:35 localhost podman[78479]: 2026-02-20 08:15:35.14229273 +0000 UTC m=+0.079463493 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:15:35 localhost podman[78479]: 2026-02-20 08:15:35.362409687 +0000 UTC m=+0.299580400 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container) Feb 20 03:15:35 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:15:38 localhost sshd[78509]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:15:42 localhost podman[78513]: 2026-02-20 08:15:42.290942879 +0000 UTC m=+0.087756937 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 20 03:15:42 localhost podman[78513]: 2026-02-20 08:15:42.32856028 +0000 UTC m=+0.125374298 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1) Feb 20 03:15:42 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:15:42 localhost podman[78514]: 2026-02-20 08:15:42.335922256 +0000 UTC m=+0.129720582 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, release=1766032510, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:15:42 localhost podman[78511]: 2026-02-20 08:15:42.390850877 +0000 UTC m=+0.192090300 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:15:42 localhost podman[78512]: 2026-02-20 08:15:42.450983128 +0000 UTC m=+0.244478054 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, 
managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com) Feb 20 03:15:42 localhost podman[78512]: 2026-02-20 08:15:42.462809979 +0000 UTC m=+0.256304905 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:15:42 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:15:42 localhost podman[78511]: 2026-02-20 08:15:42.473344522 +0000 UTC m=+0.274583905 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, io.openshift.expose-services=, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:07:30Z) Feb 20 03:15:42 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:15:42 localhost podman[78514]: 2026-02-20 08:15:42.524855868 +0000 UTC m=+0.318654124 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 20 03:15:42 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:15:46 localhost sshd[78607]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:15:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:15:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:15:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:15:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:15:47 localhost podman[78608]: 2026-02-20 08:15:47.139748106 +0000 UTC m=+0.082641221 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.buildah.version=1.41.5) Feb 20 03:15:47 localhost podman[78608]: 2026-02-20 08:15:47.191942153 +0000 UTC m=+0.134835238 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4) Feb 20 03:15:47 localhost podman[78611]: 2026-02-20 08:15:47.191415147 +0000 UTC m=+0.129332429 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_id=tripleo_step5, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:15:47 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:15:47 localhost podman[78612]: 2026-02-20 08:15:47.239620672 +0000 UTC m=+0.174168491 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 20 03:15:47 localhost podman[78610]: 2026-02-20 08:15:47.286067174 +0000 UTC m=+0.224897985 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:15:47 localhost podman[78610]: 2026-02-20 08:15:47.316058132 +0000 UTC m=+0.254888953 container exec_died 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 20 03:15:47 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:15:47 localhost podman[78611]: 2026-02-20 08:15:47.32514536 +0000 UTC m=+0.263062582 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-nova-compute-container) Feb 20 03:15:47 localhost podman[78611]: unhealthy Feb 20 03:15:47 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:15:47 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. 
Feb 20 03:15:47 localhost podman[78609]: 2026-02-20 08:15:47.402404205 +0000 UTC m=+0.341922197 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5) Feb 20 03:15:47 localhost podman[78609]: 2026-02-20 08:15:47.408426319 +0000 UTC m=+0.347944341 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:15:47 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:15:47 localhost podman[78612]: 2026-02-20 08:15:47.635897662 +0000 UTC m=+0.570445531 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.buildah.version=1.41.5, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 20 03:15:47 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:15:49 localhost sshd[78716]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:54 localhost sshd[78718]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:15:59 localhost sshd[78720]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:02 localhost sshd[78722]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:04 localhost sshd[78724]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:16:06 localhost systemd[1]: tmp-crun.LCp2Iw.mount: Deactivated successfully. 
Feb 20 03:16:06 localhost podman[78726]: 2026-02-20 08:16:06.156075058 +0000 UTC m=+0.093552785 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64) Feb 20 03:16:06 localhost podman[78726]: 2026-02-20 08:16:06.36427446 +0000 UTC m=+0.301752187 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:16:06 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:16:07 localhost sshd[78755]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:10 localhost sshd[78757]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:12 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:16:12 localhost recover_tripleo_nova_virtqemud[78760]: 63005 Feb 20 03:16:12 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:16:12 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:16:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:16:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:16:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:16:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:16:13 localhost podman[78762]: 2026-02-20 08:16:13.156116045 +0000 UTC m=+0.080651890 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com) Feb 20 03:16:13 localhost podman[78762]: 2026-02-20 08:16:13.17008699 +0000 UTC m=+0.094622875 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron) Feb 20 03:16:13 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:16:13 localhost podman[78761]: 2026-02-20 08:16:13.207281011 +0000 UTC m=+0.134303262 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 20 03:16:13 localhost podman[78763]: 2026-02-20 08:16:13.265805759 +0000 UTC m=+0.185378794 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5) Feb 20 03:16:13 localhost podman[78761]: 2026-02-20 08:16:13.283273599 +0000 UTC m=+0.210295830 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, 
distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Feb 20 03:16:13 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:16:13 localhost podman[78763]: 2026-02-20 08:16:13.307397391 +0000 UTC m=+0.226970396 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:16:13 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:16:13 localhost podman[78764]: 2026-02-20 08:16:13.374477969 +0000 UTC m=+0.291635610 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, 
name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:16:13 localhost podman[78764]: 2026-02-20 08:16:13.40180843 +0000 UTC m=+0.318966101 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git) Feb 20 03:16:13 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:16:17 localhost sshd[78914]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:16:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:16:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:16:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:16:17 localhost podman[78932]: 2026-02-20 08:16:17.51169016 +0000 UTC m=+0.075136844 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, url=https://www.redhat.com) Feb 20 03:16:17 localhost systemd[1]: tmp-crun.jVbho7.mount: Deactivated successfully. 
Feb 20 03:16:17 localhost podman[78933]: 2026-02-20 08:16:17.571671333 +0000 UTC m=+0.129489156 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:16:17 localhost podman[78937]: 2026-02-20 08:16:17.613327538 +0000 UTC m=+0.167627964 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, vendor=Red Hat, Inc., version=17.1.13, 
distribution-scope=public) Feb 20 03:16:17 localhost podman[78937]: 2026-02-20 08:16:17.62130677 +0000 UTC m=+0.175607226 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Feb 20 03:16:17 localhost podman[78932]: 2026-02-20 08:16:17.623869488 +0000 UTC m=+0.187316202 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Feb 20 03:16:17 localhost podman[78933]: 2026-02-20 08:16:17.634019177 +0000 UTC m=+0.191837050 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 20 03:16:17 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:16:17 localhost podman[78933]: unhealthy Feb 20 03:16:17 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:16:17 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. Feb 20 03:16:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:16:17 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:16:17 localhost podman[79007]: 2026-02-20 08:16:17.731143578 +0000 UTC m=+0.064517901 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public) Feb 20 03:16:17 localhost podman[78931]: 2026-02-20 08:16:17.709466629 +0000 UTC m=+0.273271844 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, 
config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Feb 20 03:16:17 localhost podman[78931]: 2026-02-20 08:16:17.789459978 +0000 UTC m=+0.353265163 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Feb 20 03:16:17 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:16:18 localhost podman[79007]: 2026-02-20 08:16:18.143246237 +0000 UTC m=+0.476620560 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5) Feb 20 03:16:18 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:16:19 localhost sshd[79043]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:20 localhost sshd[79045]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:22 localhost sshd[79047]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:26 localhost sshd[79049]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:30 localhost sshd[79051]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:31 localhost sshd[79053]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:35 localhost sshd[79055]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:16:37 localhost podman[79057]: 2026-02-20 08:16:37.146493442 +0000 UTC m=+0.081304221 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13) Feb 20 03:16:37 localhost podman[79057]: 2026-02-20 08:16:37.336961938 +0000 UTC m=+0.271772697 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container) Feb 20 03:16:37 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:16:39 localhost sshd[79088]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:16:43 localhost podman[79091]: 2026-02-20 08:16:43.418041365 +0000 UTC m=+0.087794758 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, 
vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:16:43 localhost podman[79090]: 2026-02-20 08:16:43.47646835 +0000 UTC m=+0.147456851 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-cron, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:16:43 localhost podman[79090]: 2026-02-20 08:16:43.483372729 +0000 UTC m=+0.154361230 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, distribution-scope=public, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 20 03:16:43 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:16:43 localhost podman[79091]: 2026-02-20 08:16:43.504880323 +0000 UTC m=+0.174633706 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:07:30Z, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 20 03:16:43 localhost podman[79120]: 2026-02-20 08:16:43.537789473 +0000 UTC m=+0.096857074 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, container_name=ceilometer_agent_compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:16:43 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:16:43 localhost podman[79119]: 2026-02-20 08:16:43.571006222 +0000 UTC m=+0.134753746 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, version=17.1.13, tcib_managed=true) Feb 20 03:16:43 localhost podman[79119]: 2026-02-20 08:16:43.582909624 +0000 UTC m=+0.146657128 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510) Feb 20 03:16:43 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:16:43 localhost podman[79120]: 2026-02-20 08:16:43.62227932 +0000 UTC m=+0.181346961 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1) Feb 20 03:16:43 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:16:43 localhost sshd[79180]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:44 localhost systemd[1]: tmp-crun.aryN3w.mount: Deactivated successfully. Feb 20 03:16:46 localhost sshd[79182]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:16:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:16:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:16:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:16:48 localhost systemd[1]: tmp-crun.Toh5w9.mount: Deactivated successfully. 
Feb 20 03:16:48 localhost podman[79189]: 2026-02-20 08:16:48.154566073 +0000 UTC m=+0.081473877 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public) Feb 20 03:16:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:16:48 localhost systemd[1]: tmp-crun.m5FSiX.mount: Deactivated successfully. 
Feb 20 03:16:48 localhost podman[79187]: 2026-02-20 08:16:48.213306097 +0000 UTC m=+0.146341398 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:16:48 localhost podman[79187]: 2026-02-20 08:16:48.220915318 +0000 UTC m=+0.153950609 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:16:48 localhost podman[79189]: 2026-02-20 08:16:48.229140128 +0000 UTC m=+0.156047942 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, 
cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, batch=17.1_20260112.1) Feb 20 03:16:48 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:16:48 localhost podman[79189]: unhealthy Feb 20 03:16:48 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:16:48 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. 
Feb 20 03:16:48 localhost podman[79242]: 2026-02-20 08:16:48.299697572 +0000 UTC m=+0.090271084 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.13, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 20 03:16:48 localhost podman[79186]: 2026-02-20 08:16:48.314903693 +0000 UTC m=+0.247696945 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 20 03:16:48 localhost podman[79186]: 2026-02-20 08:16:48.340040367 +0000 UTC m=+0.272833649 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 03:16:48 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:16:48 localhost podman[79188]: 2026-02-20 08:16:48.412588681 +0000 UTC m=+0.341869437 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z) Feb 20 03:16:48 localhost podman[79188]: 2026-02-20 08:16:48.45697893 +0000 UTC m=+0.386259736 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:16:48 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:16:48 localhost podman[79242]: 2026-02-20 08:16:48.654120939 +0000 UTC m=+0.444694421 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=nova_migration_target, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64) Feb 20 03:16:48 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:16:49 localhost sshd[79336]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:51 localhost sshd[79387]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:55 localhost sshd[79389]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:58 localhost systemd[1]: libpod-9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2.scope: Deactivated successfully. 
Feb 20 03:16:58 localhost podman[77535]: 2026-02-20 08:16:58.603604776 +0000 UTC m=+192.310330374 container died 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, url=https://www.redhat.com, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z) Feb 20 03:16:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2-userdata-shm.mount: Deactivated successfully. Feb 20 03:16:58 localhost systemd[1]: var-lib-containers-storage-overlay-bcdbf69658b435b3643ef361fbfcbd57ebf5cb53d4f9a18cec2f56d5690ff17c-merged.mount: Deactivated successfully. 
Feb 20 03:16:58 localhost podman[79392]: 2026-02-20 08:16:58.704850902 +0000 UTC m=+0.085362175 container cleanup 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_wait_for_compute_service, 
managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Feb 20 03:16:58 localhost systemd[1]: libpod-conmon-9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2.scope: Deactivated successfully. Feb 20 03:16:58 localhost python3[77355]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 20 03:16:59 localhost python3[79441]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:16:59 localhost sshd[79457]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:16:59 
localhost python3[79458]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 20 03:17:00 localhost python3[79520]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575419.6333683-118857-145106558304638/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:17:00 localhost python3[79536]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 03:17:00 localhost systemd[1]: Reloading. Feb 20 03:17:00 localhost systemd-sysv-generator[79564]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:17:00 localhost systemd-rc-local-generator[79558]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:17:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:17:01 localhost python3[79588]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:17:01 localhost systemd[1]: Reloading. Feb 20 03:17:01 localhost systemd-rc-local-generator[79613]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 03:17:01 localhost systemd-sysv-generator[79617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:17:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:17:01 localhost systemd[1]: Starting nova_compute container... Feb 20 03:17:02 localhost tripleo-start-podman-container[79627]: Creating additional drop-in dependency for "nova_compute" (a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380) Feb 20 03:17:02 localhost systemd[1]: Reloading. Feb 20 03:17:02 localhost systemd-sysv-generator[79682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:17:02 localhost systemd-rc-local-generator[79678]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:17:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:17:02 localhost systemd[1]: Started nova_compute container. 
Feb 20 03:17:02 localhost python3[79723]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:17:03 localhost sshd[79724]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:04 localhost python3[79846]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005625204 step=5 update_config_hash_only=False Feb 20 03:17:04 localhost python3[79862]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 03:17:05 localhost sshd[79879]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:05 localhost python3[79878]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 20 03:17:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:17:07 localhost sshd[79893]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:08 localhost podman[79881]: 2026-02-20 08:17:08.002314603 +0000 UTC m=+0.091513132 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13) Feb 20 03:17:08 localhost podman[79881]: 2026-02-20 08:17:08.157849368 +0000 UTC m=+0.247047887 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 20 03:17:08 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:17:10 localhost sshd[79912]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:13 localhost sshd[79914]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:17:14 localhost podman[79918]: 2026-02-20 08:17:14.156764557 +0000 UTC m=+0.086611792 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 
'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, url=https://www.redhat.com) Feb 20 03:17:14 localhost systemd[1]: tmp-crun.beurMQ.mount: Deactivated successfully. 
Feb 20 03:17:14 localhost podman[79917]: 2026-02-20 08:17:14.209424166 +0000 UTC m=+0.142529080 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, tcib_managed=true, name=rhosp-rhel9/openstack-cron, distribution-scope=public) Feb 20 03:17:14 localhost podman[79917]: 2026-02-20 08:17:14.221229476 +0000 UTC m=+0.154334390 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:17:14 localhost podman[79918]: 2026-02-20 08:17:14.221418731 +0000 UTC m=+0.151265976 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-collectd-container, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
url=https://www.redhat.com) Feb 20 03:17:14 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:17:14 localhost podman[79919]: 2026-02-20 08:17:14.265025786 +0000 UTC m=+0.188720304 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public) Feb 20 03:17:14 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:17:14 localhost podman[79919]: 2026-02-20 08:17:14.2961191 +0000 UTC m=+0.219813558 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:07:47Z, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:17:14 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:17:14 localhost podman[79916]: 2026-02-20 08:17:14.354110482 +0000 UTC m=+0.287940089 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:30Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:17:14 localhost podman[79916]: 2026-02-20 08:17:14.405157093 +0000 UTC m=+0.338986700 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:17:14 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:17:16 localhost sshd[80010]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:17:19 localhost podman[80090]: 2026-02-20 08:17:19.04252943 +0000 UTC m=+0.083353344 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, 
architecture=x86_64, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:17:19 localhost systemd[1]: tmp-crun.CeviBW.mount: Deactivated successfully. Feb 20 03:17:19 localhost podman[80090]: 2026-02-20 08:17:19.083997409 +0000 UTC m=+0.124821313 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510) Feb 20 03:17:19 localhost podman[80092]: 2026-02-20 08:17:19.090356693 +0000 UTC m=+0.122221155 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step5, description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:17:19 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:17:19 localhost podman[80092]: 2026-02-20 08:17:19.113179856 +0000 UTC m=+0.145044338 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=nova_compute) Feb 20 03:17:19 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:17:19 localhost podman[80103]: 2026-02-20 08:17:19.065841918 +0000 UTC m=+0.092627485 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 20 03:17:19 localhost podman[80091]: 2026-02-20 08:17:19.250360864 +0000 UTC m=+0.285139194 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64) Feb 20 03:17:19 localhost podman[80089]: 2026-02-20 08:17:19.296423852 +0000 UTC m=+0.337526534 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, release=1766032510, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Feb 20 03:17:19 localhost podman[80091]: 2026-02-20 08:17:19.308558042 +0000 UTC m=+0.343336392 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, tcib_managed=true, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 20 03:17:19 localhost podman[80089]: 2026-02-20 08:17:19.31837663 +0000 UTC m=+0.359479342 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64) Feb 20 03:17:19 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:17:19 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:17:19 localhost podman[80103]: 2026-02-20 08:17:19.437002303 +0000 UTC m=+0.463787890 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 20 03:17:19 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:17:21 localhost sshd[80198]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:25 localhost sshd[80200]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:29 localhost sshd[80202]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:33 localhost sshd[80204]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:34 localhost sshd[80206]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:34 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:17:34 localhost recover_tripleo_nova_virtqemud[80209]: 63005 Feb 20 03:17:34 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:17:34 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:17:34 localhost systemd-logind[759]: New session 34 of user zuul. Feb 20 03:17:34 localhost systemd[1]: Started Session 34 of User zuul. 
Feb 20 03:17:35 localhost python3[80317]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 03:17:36 localhost sshd[80395]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:17:39 localhost systemd[1]: tmp-crun.qx1phW.mount: Deactivated successfully. Feb 20 03:17:39 localhost podman[80506]: 2026-02-20 08:17:39.032688907 +0000 UTC m=+0.094035557 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:17:39 localhost podman[80506]: 2026-02-20 08:17:39.227363172 +0000 UTC m=+0.288709822 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, vendor=Red Hat, Inc.) Feb 20 03:17:39 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:17:40 localhost sshd[80533]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:42 localhost python3[80611]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Feb 20 03:17:43 localhost sshd[80614]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:17:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:17:44 localhost podman[80616]: 2026-02-20 08:17:44.381965311 +0000 UTC m=+0.061551271 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:17:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:17:44 localhost podman[80617]: 2026-02-20 08:17:44.419917593 +0000 UTC m=+0.093613464 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, container_name=collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:17:44 localhost podman[80617]: 2026-02-20 08:17:44.435191228 +0000 UTC m=+0.108887099 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Feb 20 03:17:44 localhost podman[80616]: 2026-02-20 08:17:44.44745526 +0000 UTC m=+0.127041280 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, 
io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 20 03:17:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:17:44 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:17:44 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:17:44 localhost systemd[1]: tmp-crun.BMeDHx.mount: Deactivated successfully. 
Feb 20 03:17:44 localhost podman[80664]: 2026-02-20 08:17:44.539387873 +0000 UTC m=+0.080176977 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, 
io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:17:44 localhost podman[80647]: 2026-02-20 08:17:44.589587078 +0000 UTC m=+0.183598778 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=) Feb 20 03:17:44 localhost podman[80664]: 2026-02-20 08:17:44.596767917 +0000 UTC m=+0.137557021 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.13, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4) Feb 20 03:17:44 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:17:44 localhost podman[80647]: 2026-02-20 08:17:44.642239458 +0000 UTC m=+0.236251178 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:17:44 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:17:47 localhost python3[80798]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Feb 20 03:17:47 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled Feb 20 03:17:47 localhost systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation. Feb 20 03:17:47 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 20 03:17:47 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 03:17:47 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 03:17:47 localhost sshd[80821]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:17:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:17:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:17:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:17:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:17:50 localhost sshd[80872]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:50 localhost systemd[1]: tmp-crun.e8kkiz.mount: Deactivated successfully. Feb 20 03:17:50 localhost podman[80869]: 2026-02-20 08:17:50.15669532 +0000 UTC m=+0.088640073 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=iscsid, release=1766032510, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, vcs-type=git) Feb 20 03:17:50 localhost podman[80869]: 2026-02-20 08:17:50.167028564 +0000 UTC m=+0.098973327 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., 
release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z) Feb 20 03:17:50 localhost podman[80868]: 2026-02-20 08:17:50.125857143 +0000 UTC m=+0.061588202 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1) Feb 20 03:17:50 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:17:50 localhost podman[80868]: 2026-02-20 08:17:50.208995549 +0000 UTC m=+0.144726598 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, distribution-scope=public) Feb 20 03:17:50 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:17:50 localhost podman[80877]: 2026-02-20 08:17:50.257488812 +0000 UTC m=+0.178950568 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:17:50 localhost podman[80870]: 2026-02-20 08:17:50.310285336 +0000 UTC m=+0.238243719 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13) Feb 20 03:17:50 localhost podman[80871]: 2026-02-20 08:17:50.3656968 +0000 UTC m=+0.292337272 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, url=https://www.redhat.com, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, release=1766032510, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:17:50 localhost podman[80870]: 2026-02-20 08:17:50.381980925 +0000 UTC m=+0.309939308 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Feb 20 03:17:50 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:17:50 localhost podman[80871]: 2026-02-20 08:17:50.403064845 +0000 UTC m=+0.329705317 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 20 03:17:50 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:17:50 localhost podman[80877]: 2026-02-20 08:17:50.649342947 +0000 UTC m=+0.570804743 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., 
build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Feb 20 03:17:50 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:17:55 localhost sshd[80983]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:58 localhost sshd[80985]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:17:58 localhost sshd[80986]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:03 localhost sshd[80989]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:07 localhost sshd[80991]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:18:10 localhost podman[80993]: 2026-02-20 08:18:10.148203881 +0000 UTC m=+0.086170959 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1) Feb 20 03:18:10 localhost podman[80993]: 2026-02-20 08:18:10.372044741 +0000 UTC m=+0.310011759 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com) Feb 20 03:18:10 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:18:13 localhost sshd[81023]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:18:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:18:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:18:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:18:14 localhost podman[81028]: 2026-02-20 08:18:14.915006349 +0000 UTC m=+0.078100024 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:18:14 localhost podman[81025]: 2026-02-20 08:18:14.951380244 +0000 UTC m=+0.122888435 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:18:14 localhost podman[81028]: 2026-02-20 08:18:14.95847222 +0000 UTC m=+0.121565895 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com) Feb 20 03:18:14 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:18:15 localhost podman[81025]: 2026-02-20 08:18:15.012117719 +0000 UTC m=+0.183625890 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5) Feb 20 03:18:15 localhost systemd[1]: tmp-crun.Ax18bw.mount: Deactivated successfully. Feb 20 03:18:15 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:18:15 localhost podman[81026]: 2026-02-20 08:18:15.055768725 +0000 UTC m=+0.227340148 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond) Feb 20 03:18:15 localhost podman[81027]: 2026-02-20 08:18:15.015217543 +0000 UTC m=+0.181482914 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:18:15 localhost podman[81027]: 2026-02-20 08:18:15.098044479 +0000 UTC m=+0.264309880 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd) Feb 20 03:18:15 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:18:15 localhost podman[81026]: 2026-02-20 08:18:15.118992406 +0000 UTC m=+0.290563829 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 20 03:18:15 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:18:15 localhost sshd[81118]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:18 localhost sshd[81120]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:18:20 localhost systemd[1]: tmp-crun.7ToJat.mount: Deactivated successfully. 
Feb 20 03:18:20 localhost podman[81184]: 2026-02-20 08:18:20.321362636 +0000 UTC m=+0.088943153 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git) Feb 20 03:18:20 localhost podman[81184]: 2026-02-20 08:18:20.350976406 +0000 UTC m=+0.118556883 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:18:20 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:18:20 localhost podman[81183]: 2026-02-20 08:18:20.362996411 +0000 UTC m=+0.129778704 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 20 03:18:20 localhost podman[81183]: 2026-02-20 08:18:20.377483021 +0000 UTC m=+0.144265294 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
distribution-scope=public, vendor=Red Hat, Inc.) Feb 20 03:18:20 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:18:20 localhost podman[81240]: 2026-02-20 08:18:20.604132977 +0000 UTC m=+0.077868047 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step5, distribution-scope=public, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:18:20 localhost podman[81238]: 2026-02-20 08:18:20.657271521 +0000 UTC m=+0.134171077 container health_status 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:18:20 localhost podman[81240]: 2026-02-20 08:18:20.682008903 +0000 UTC m=+0.155743993 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Feb 20 03:18:20 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:18:20 localhost podman[81238]: 2026-02-20 08:18:20.708102305 +0000 UTC m=+0.185001791 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vcs-type=git, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4) Feb 20 03:18:20 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:18:20 localhost podman[81288]: 2026-02-20 08:18:20.798870202 +0000 UTC m=+0.080258078 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 20 03:18:21 localhost podman[81288]: 2026-02-20 08:18:21.174226296 +0000 UTC m=+0.455614212 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:18:21 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:18:21 localhost sshd[81311]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:27 localhost sshd[81313]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:29 localhost sshd[81315]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:32 localhost sshd[81317]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:33 localhost sshd[81318]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:36 localhost sshd[81321]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:39 localhost sshd[81323]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:18:41 localhost podman[81325]: 2026-02-20 08:18:41.149790392 +0000 UTC m=+0.087569511 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:18:41 localhost podman[81325]: 2026-02-20 08:18:41.333317288 +0000 UTC m=+0.271096397 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, maintainer=OpenStack 
TripleO Team) Feb 20 03:18:41 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:18:42 localhost sshd[81354]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:44 localhost sshd[81356]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:18:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:18:45 localhost podman[81358]: 2026-02-20 08:18:45.139799299 +0000 UTC m=+0.067935314 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 20 03:18:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:18:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:18:45 localhost podman[81358]: 2026-02-20 08:18:45.19080283 +0000 UTC m=+0.118938785 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 20 03:18:45 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:18:45 localhost podman[81390]: 2026-02-20 08:18:45.247556254 +0000 UTC m=+0.078881968 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, 
summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, container_name=logrotate_crond, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z) Feb 20 03:18:45 localhost podman[81357]: 2026-02-20 08:18:45.198226875 +0000 UTC m=+0.130812885 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, 
batch=17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:18:45 localhost podman[81394]: 2026-02-20 08:18:45.303174623 +0000 UTC m=+0.131268210 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:18:45 localhost podman[81357]: 2026-02-20 08:18:45.331984598 +0000 UTC m=+0.264570598 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public) Feb 20 03:18:45 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:18:45 localhost podman[81394]: 2026-02-20 08:18:45.340469796 +0000 UTC m=+0.168563352 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container) Feb 20 03:18:45 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:18:45 localhost podman[81390]: 2026-02-20 08:18:45.3850424 +0000 UTC m=+0.216368094 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:18:45 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:18:45 localhost sshd[81449]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:46 localhost systemd[1]: session-34.scope: Deactivated successfully. Feb 20 03:18:46 localhost systemd[1]: session-34.scope: Consumed 6.047s CPU time. 
Feb 20 03:18:46 localhost systemd-logind[759]: Session 34 logged out. Waiting for processes to exit. Feb 20 03:18:46 localhost systemd-logind[759]: Removed session 34. Feb 20 03:18:49 localhost sshd[81451]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:18:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:18:50 localhost systemd[1]: tmp-crun.ENK6FA.mount: Deactivated successfully. Feb 20 03:18:50 localhost podman[81473]: 2026-02-20 08:18:50.621240908 +0000 UTC m=+0.086511090 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, 
name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13) Feb 20 03:18:50 localhost podman[81474]: 2026-02-20 08:18:50.690229424 +0000 UTC m=+0.152874496 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:18:50 localhost podman[81473]: 2026-02-20 08:18:50.698205976 +0000 UTC m=+0.163476148 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vcs-type=git) Feb 20 03:18:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:18:50 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:18:50 localhost podman[81474]: 2026-02-20 08:18:50.750090962 +0000 UTC m=+0.212736024 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-type=git, container_name=iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:18:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:18:50 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:18:50 localhost podman[81516]: 2026-02-20 08:18:50.820437179 +0000 UTC m=+0.093599694 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_compute, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, release=1766032510) Feb 20 03:18:50 localhost podman[81529]: 2026-02-20 08:18:50.859212517 +0000 UTC m=+0.074039070 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, distribution-scope=public, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:18:50 localhost podman[81516]: 2026-02-20 08:18:50.891486048 +0000 UTC m=+0.164648633 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, url=https://www.redhat.com) Feb 20 03:18:50 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:18:50 localhost podman[81529]: 2026-02-20 08:18:50.934110863 +0000 UTC m=+0.148937386 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public) Feb 20 03:18:50 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:18:51 localhost sshd[81588]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:18:51 localhost systemd[1]: tmp-crun.I3HnZt.mount: Deactivated successfully. Feb 20 03:18:51 localhost systemd[1]: tmp-crun.eVPD42.mount: Deactivated successfully. 
Feb 20 03:18:51 localhost podman[81589]: 2026-02-20 08:18:51.649532147 +0000 UTC m=+0.093337386 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4) Feb 20 03:18:52 localhost podman[81589]: 2026-02-20 08:18:52.027104018 +0000 UTC m=+0.470909257 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 20 03:18:52 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:18:53 localhost sshd[81615]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:53 localhost sshd[81617]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:18:59 localhost sshd[81619]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:01 localhost sshd[81621]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:01 localhost systemd-logind[759]: New session 35 of user zuul. Feb 20 03:19:01 localhost systemd[1]: Started Session 35 of User zuul. 
Feb 20 03:19:01 localhost python3[81640]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 20 03:19:05 localhost sshd[81642]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:07 localhost sshd[81644]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:19:12 localhost systemd[1]: tmp-crun.gsEFlc.mount: Deactivated successfully. 
Feb 20 03:19:12 localhost podman[81646]: 2026-02-20 08:19:12.15687201 +0000 UTC m=+0.092026647 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, tcib_managed=true) Feb 20 03:19:12 localhost podman[81646]: 2026-02-20 08:19:12.398055457 +0000 UTC m=+0.333210104 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container) Feb 20 03:19:12 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:19:14 localhost sshd[81675]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:19:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:19:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:19:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:19:16 localhost systemd[1]: tmp-crun.ZG0z0x.mount: Deactivated successfully. Feb 20 03:19:16 localhost podman[81678]: 2026-02-20 08:19:16.142333279 +0000 UTC m=+0.081713863 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:19:16 localhost podman[81678]: 2026-02-20 08:19:16.152112237 +0000 UTC m=+0.091492821 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vcs-type=git, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z) Feb 20 03:19:16 localhost podman[81680]: 2026-02-20 08:19:16.158448269 +0000 UTC m=+0.094029238 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true) Feb 20 03:19:16 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:19:16 localhost podman[81680]: 2026-02-20 08:19:16.185069377 +0000 UTC m=+0.120650426 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:19:16 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:19:16 localhost podman[81679]: 2026-02-20 08:19:16.190235034 +0000 UTC m=+0.128330880 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:19:16 localhost podman[81677]: 2026-02-20 08:19:16.248964208 +0000 UTC m=+0.189033933 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com) Feb 20 03:19:16 localhost podman[81677]: 2026-02-20 08:19:16.272026669 +0000 UTC m=+0.212096414 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team) Feb 20 03:19:16 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:19:16 localhost podman[81679]: 2026-02-20 08:19:16.32240867 +0000 UTC m=+0.260504576 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, container_name=collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true) Feb 20 03:19:16 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:19:16 localhost sshd[81768]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:17 localhost systemd[1]: tmp-crun.TlVR7l.mount: Deactivated successfully. Feb 20 03:19:19 localhost sshd[81770]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:19:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:19:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:19:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:19:21 localhost podman[81819]: 2026-02-20 08:19:21.13441151 +0000 UTC m=+0.064096749 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step5, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container) Feb 20 03:19:21 localhost podman[81816]: 2026-02-20 08:19:21.172380863 +0000 UTC m=+0.105526147 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, 
release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_controller, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:19:21 localhost podman[81817]: 2026-02-20 08:19:21.247270108 +0000 UTC m=+0.180381981 container health_status 
5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3, batch=17.1_20260112.1) Feb 20 03:19:21 localhost podman[81817]: 2026-02-20 08:19:21.258932843 +0000 UTC m=+0.192044736 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step3, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team) Feb 20 03:19:21 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:19:21 localhost podman[81816]: 2026-02-20 08:19:21.268856594 +0000 UTC m=+0.202001858 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, release=1766032510, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 03:19:21 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:19:21 localhost podman[81819]: 2026-02-20 08:19:21.309575261 +0000 UTC m=+0.239260430 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step5, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:19:21 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:19:21 localhost podman[81818]: 2026-02-20 08:19:21.398217534 +0000 UTC m=+0.328325126 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public) Feb 20 03:19:21 localhost podman[81818]: 2026-02-20 08:19:21.444184241 +0000 UTC m=+0.374291823 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, container_name=ovn_metadata_agent, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 20 03:19:21 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:19:22 localhost systemd[1]: tmp-crun.DQK7OI.mount: Deactivated successfully. Feb 20 03:19:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:19:22 localhost podman[81922]: 2026-02-20 08:19:22.245402012 +0000 UTC m=+0.087392256 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container) Feb 20 03:19:22 localhost podman[81922]: 2026-02-20 08:19:22.645222018 +0000 UTC m=+0.487212242 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, release=1766032510) Feb 20 03:19:22 
localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 03:19:22 localhost sshd[81945]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:19:26 localhost sshd[81963]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:19:26 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 03:19:26 localhost recover_tripleo_nova_virtqemud[81966]: 63005
Feb 20 03:19:26 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 03:19:26 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 03:19:27 localhost sshd[81967]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:19:30 localhost python3[81984]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 03:19:31 localhost sshd[81986]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:19:34 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 03:19:34 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 20 03:19:34 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 03:19:34 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 03:19:34 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 20 03:19:34 localhost systemd[1]: run-r20ebb92b4d124708b26a802eb60ecf24.service: Deactivated successfully.
Feb 20 03:19:34 localhost systemd[1]: run-r82e63e13d0294acca6ca1ab412f7b379.service: Deactivated successfully.
Feb 20 03:19:36 localhost sshd[82139]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:19:38 localhost sshd[82141]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:19:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 03:19:42 localhost podman[82143]: 2026-02-20 08:19:42.704533519 +0000 UTC m=+0.220701207 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, version=17.1.13, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:19:42 localhost podman[82143]: 2026-02-20 08:19:42.903120552 +0000 UTC m=+0.419288260 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:19:42 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:19:43 localhost sshd[82173]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:19:46 localhost podman[82175]: 2026-02-20 08:19:46.414503178 +0000 UTC m=+0.096390509 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:19:46 localhost podman[82175]: 2026-02-20 08:19:46.449132221 +0000 UTC m=+0.131019552 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, architecture=x86_64) Feb 20 03:19:46 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:19:46 localhost systemd[1]: tmp-crun.9glfMt.mount: Deactivated successfully. Feb 20 03:19:46 localhost podman[82210]: 2026-02-20 08:19:46.551681946 +0000 UTC m=+0.136557469 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, managed_by=tripleo_ansible) Feb 20 03:19:46 localhost podman[82210]: 2026-02-20 08:19:46.562106653 +0000 UTC m=+0.146982186 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, 
description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z) Feb 20 03:19:46 localhost podman[82176]: 2026-02-20 08:19:46.521531119 +0000 UTC m=+0.199357037 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true) Feb 20 03:19:46 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:19:46 localhost podman[82177]: 2026-02-20 08:19:46.603198361 +0000 UTC m=+0.278092370 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 20 03:19:46 localhost podman[82176]: 2026-02-20 08:19:46.65156036 +0000 UTC m=+0.329386348 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:47Z, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, vcs-type=git, distribution-scope=public) Feb 20 03:19:46 localhost podman[82177]: 2026-02-20 08:19:46.660224514 +0000 UTC m=+0.335118523 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:19:46 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:19:46 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:19:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:19:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 4463 writes, 20K keys, 4463 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4463 writes, 468 syncs, 9.54 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:19:48 localhost sshd[82267]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:48 localhost sshd[82269]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:19:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:19:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:19:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:19:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 5194 writes, 22K keys, 5194 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5194 writes, 621 syncs, 8.36 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:19:51 localhost podman[82314]: 2026-02-20 08:19:51.448720229 +0000 UTC m=+0.082818056 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, tcib_managed=true) Feb 20 03:19:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:19:51 localhost systemd[1]: tmp-crun.SLE0XG.mount: Deactivated successfully. 
Feb 20 03:19:51 localhost podman[82314]: 2026-02-20 08:19:51.511112945 +0000 UTC m=+0.145210762 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:36:40Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Feb 20 03:19:51 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:19:51 localhost podman[82315]: 2026-02-20 08:19:51.566849968 +0000 UTC m=+0.196726808 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3) Feb 20 03:19:51 localhost podman[82315]: 2026-02-20 08:19:51.578958506 +0000 UTC m=+0.208835346 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, distribution-scope=public, url=https://www.redhat.com) Feb 20 03:19:51 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:19:51 localhost podman[82359]: 2026-02-20 08:19:51.58897538 +0000 UTC m=+0.096440110 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510) Feb 20 03:19:51 localhost podman[82316]: 2026-02-20 08:19:51.516796448 +0000 UTC m=+0.143289745 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5) Feb 20 03:19:51 localhost podman[82316]: 2026-02-20 08:19:51.648003073 +0000 UTC m=+0.274496330 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:19:51 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:19:51 localhost podman[82359]: 2026-02-20 08:19:51.670232979 +0000 UTC m=+0.177697709 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z) Feb 20 03:19:51 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:19:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:19:53 localhost podman[82406]: 2026-02-20 08:19:53.139785235 +0000 UTC m=+0.077258259 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Feb 20 03:19:53 localhost podman[82406]: 2026-02-20 08:19:53.512995873 +0000 UTC m=+0.450468907 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:19:53 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:19:54 localhost sshd[82429]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:19:58 localhost sshd[82431]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:00 localhost sshd[82433]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:06 localhost sshd[82435]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:07 localhost sshd[82436]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:09 localhost sshd[82439]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:11 localhost sshd[82441]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:12 localhost python3[82458]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 03:20:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:20:13 localhost systemd[1]: tmp-crun.GIIyPv.mount: Deactivated successfully. 
Feb 20 03:20:13 localhost podman[82461]: 2026-02-20 08:20:13.139961079 +0000 UTC m=+0.080197086 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:20:13 localhost podman[82461]: 2026-02-20 08:20:13.333691765 +0000 UTC m=+0.273927802 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=) Feb 20 03:20:13 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:20:14 localhost sshd[82490]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:15 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 20 03:20:16 localhost sshd[82616]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:20:17 localhost systemd[1]: tmp-crun.dlhQ31.mount: Deactivated successfully. Feb 20 03:20:17 localhost podman[82620]: 2026-02-20 08:20:17.156765052 +0000 UTC m=+0.092518932 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 20 03:20:17 localhost podman[82621]: 2026-02-20 08:20:17.203255245 +0000 UTC m=+0.136099846 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=logrotate_crond, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z) Feb 20 03:20:17 localhost podman[82620]: 2026-02-20 08:20:17.214793045 +0000 UTC m=+0.150546895 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, 
build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:20:17 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:20:17 localhost podman[82623]: 2026-02-20 08:20:17.264381362 +0000 UTC m=+0.190880281 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 20 03:20:17 localhost podman[82622]: 2026-02-20 08:20:17.315166505 +0000 UTC m=+0.245081648 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, description=Red 
Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:20:17 localhost podman[82623]: 2026-02-20 08:20:17.324062385 +0000 UTC m=+0.250561244 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, version=17.1.13) Feb 20 03:20:17 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:20:17 localhost podman[82621]: 2026-02-20 08:20:17.34100826 +0000 UTC m=+0.273852911 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:20:17 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:20:17 localhost podman[82622]: 2026-02-20 08:20:17.375485867 +0000 UTC m=+0.305401010 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:20:17 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:20:20 localhost sshd[82765]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:20:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:20:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:20:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:20:22 localhost systemd[1]: tmp-crun.Bbl1Zk.mount: Deactivated successfully. Feb 20 03:20:22 localhost podman[82769]: 2026-02-20 08:20:22.158970812 +0000 UTC m=+0.088951154 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, batch=17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:20:22 localhost systemd[1]: tmp-crun.tEMcsT.mount: Deactivated successfully. 
Feb 20 03:20:22 localhost podman[82768]: 2026-02-20 08:20:22.206385612 +0000 UTC m=+0.141057907 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, 
version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, distribution-scope=public, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:20:22 localhost podman[82768]: 2026-02-20 08:20:22.249990977 +0000 UTC m=+0.184663292 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:20:22 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:20:22 localhost podman[82770]: 2026-02-20 08:20:22.260830126 +0000 UTC m=+0.185687942 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=) Feb 20 03:20:22 localhost podman[82767]: 2026-02-20 08:20:22.310619579 +0000 UTC m=+0.244248701 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-type=git, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com) Feb 20 03:20:22 localhost podman[82770]: 2026-02-20 08:20:22.31624428 +0000 UTC m=+0.241102046 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.component=openstack-nova-compute-container, version=17.1.13, container_name=nova_compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:20:22 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:20:22 localhost podman[82767]: 2026-02-20 08:20:22.334625449 +0000 UTC m=+0.268254541 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:20:22 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:20:22 localhost podman[82769]: 2026-02-20 08:20:22.391039942 +0000 UTC m=+0.321020274 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com) Feb 20 03:20:22 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:20:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:20:24 localhost podman[82859]: 2026-02-20 08:20:24.142350817 +0000 UTC m=+0.081187087 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, 
batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:20:24 localhost podman[82859]: 2026-02-20 08:20:24.484267865 +0000 UTC m=+0.423104095 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 20 03:20:24 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:20:25 localhost sshd[82946]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:26 localhost sshd[82978]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:30 localhost sshd[83012]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:33 localhost sshd[83014]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:34 localhost sshd[83016]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:37 localhost sshd[83018]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:40 localhost sshd[83020]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:20:44 localhost podman[83022]: 2026-02-20 08:20:44.14751627 +0000 UTC m=+0.083773797 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., tcib_managed=true, container_name=metrics_qdr, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1) Feb 20 03:20:44 localhost podman[83022]: 2026-02-20 08:20:44.317729186 +0000 UTC m=+0.253986783 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.buildah.version=1.41.5, managed_by=tripleo_ansible) Feb 20 03:20:44 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:20:45 localhost sshd[83051]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:20:48 localhost podman[83053]: 2026-02-20 08:20:48.117691805 +0000 UTC m=+0.078617859 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:20:48 localhost podman[83053]: 2026-02-20 08:20:48.170361815 +0000 UTC m=+0.131287849 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, 
config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public) Feb 20 03:20:48 localhost podman[83055]: 2026-02-20 08:20:48.180022981 +0000 UTC m=+0.134851968 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public) Feb 20 03:20:48 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:20:48 localhost podman[83055]: 2026-02-20 08:20:48.219755734 +0000 UTC m=+0.174584751 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:20:48 localhost podman[83054]: 2026-02-20 08:20:48.230595457 +0000 UTC m=+0.186440915 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, com.redhat.component=openstack-cron-container, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:20:48 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:20:48 localhost podman[83056]: 2026-02-20 08:20:48.299027432 +0000 UTC m=+0.251002701 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:20:48 localhost podman[83054]: 2026-02-20 08:20:48.315291473 +0000 UTC m=+0.271136991 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:20:48 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:20:48 localhost podman[83056]: 2026-02-20 08:20:48.335173614 +0000 UTC m=+0.287148853 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:20:48 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:20:51 localhost sshd[83146]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:20:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:20:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:20:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:20:53 localhost podman[83174]: 2026-02-20 08:20:53.166742413 +0000 UTC m=+0.092052772 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=nova_compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:20:53 localhost podman[83172]: 2026-02-20 08:20:53.211931314 +0000 UTC m=+0.145758615 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, 
container_name=iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team) Feb 20 03:20:53 
localhost podman[83172]: 2026-02-20 08:20:53.221127026 +0000 UTC m=+0.154954327 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
build-date=2026-01-12T22:34:43Z, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid) Feb 20 03:20:53 localhost podman[83173]: 2026-02-20 08:20:53.257576817 +0000 UTC m=+0.187756156 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 20 03:20:53 localhost podman[83174]: 2026-02-20 08:20:53.271938629 +0000 UTC m=+0.197248928 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, 
batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Feb 20 03:20:53 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:20:53 localhost podman[83173]: 2026-02-20 08:20:53.308092611 +0000 UTC m=+0.238271900 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 20 03:20:53 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:20:53 localhost podman[83171]: 2026-02-20 08:20:53.318486361 +0000 UTC m=+0.253040055 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, container_name=ovn_controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:20:53 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:20:53 localhost podman[83171]: 2026-02-20 08:20:53.348450332 +0000 UTC m=+0.283004046 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public) Feb 20 03:20:53 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:20:54 localhost sshd[83282]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:20:54 localhost systemd[1]: tmp-crun.tqPNHc.mount: Deactivated successfully. 
Feb 20 03:20:54 localhost podman[83284]: 2026-02-20 08:20:54.851603126 +0000 UTC m=+0.085596484 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 20 03:20:55 localhost sshd[83308]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:20:55 localhost podman[83284]: 2026-02-20 08:20:55.220001907 +0000 UTC m=+0.453995275 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:32:04Z, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4) Feb 20 03:20:55 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:20:58 localhost sshd[83311]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:02 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:21:02 localhost recover_tripleo_nova_virtqemud[83314]: 63005 Feb 20 03:21:02 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 20 03:21:02 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:21:02 localhost python3[83330]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 03:21:03 localhost sshd[83334]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:05 localhost rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 20 03:21:09 localhost sshd[83519]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:21:14 localhost podman[83521]: 2026-02-20 08:21:14.679318237 +0000 UTC m=+0.074223404 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:21:14 localhost podman[83521]: 2026-02-20 08:21:14.878946717 +0000 UTC m=+0.273851884 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5) Feb 20 03:21:14 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:21:15 localhost sshd[83549]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:21:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:21:18 localhost systemd[1]: tmp-crun.Sw9Hmr.mount: Deactivated successfully. Feb 20 03:21:18 localhost podman[83552]: 2026-02-20 08:21:18.357238163 +0000 UTC m=+0.088386879 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, container_name=collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible) Feb 20 03:21:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:21:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:21:18 localhost podman[83551]: 2026-02-20 08:21:18.403386252 +0000 UTC m=+0.137831530 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, 
release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, vcs-type=git) Feb 20 03:21:18 localhost podman[83552]: 2026-02-20 08:21:18.425873354 +0000 UTC m=+0.157022100 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, vcs-type=git) Feb 20 03:21:18 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:21:18 localhost podman[83551]: 2026-02-20 08:21:18.457762775 +0000 UTC m=+0.192208153 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, release=1766032510, container_name=ceilometer_agent_ipmi, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:21:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:21:18 localhost podman[83581]: 2026-02-20 08:21:18.520698591 +0000 UTC m=+0.134362574 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, version=17.1.13, release=1766032510, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z) Feb 20 03:21:18 localhost podman[83581]: 2026-02-20 08:21:18.531989268 +0000 UTC m=+0.145653291 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 20 03:21:18 localhost systemd[1]: 
1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:21:18 localhost podman[83582]: 2026-02-20 08:21:18.622707048 +0000 UTC m=+0.232392799 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5) Feb 20 03:21:18 localhost podman[83582]: 2026-02-20 08:21:18.682047094 +0000 UTC m=+0.291732875 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, tcib_managed=true, vcs-type=git) Feb 20 03:21:18 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:21:22 localhost python3[83657]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Feb 20 03:21:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:21:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:21:23 localhost podman[83659]: 2026-02-20 08:21:23.428599159 +0000 UTC m=+0.084594893 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, batch=17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:21:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:21:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:21:23 localhost podman[83658]: 2026-02-20 08:21:23.484685834 +0000 UTC m=+0.141101441 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public) Feb 20 03:21:23 localhost podman[83659]: 2026-02-20 08:21:23.484598121 +0000 UTC m=+0.140593885 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, 
distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20260112.1, container_name=nova_compute, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13) Feb 20 03:21:23 localhost podman[83691]: 2026-02-20 08:21:23.543939506 +0000 UTC m=+0.087350448 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 20 03:21:23 localhost podman[83658]: 2026-02-20 08:21:23.556984568 +0000 UTC m=+0.213400185 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 
17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 20 03:21:23 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:21:23 localhost podman[83691]: 2026-02-20 08:21:23.570287426 +0000 UTC m=+0.113698428 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step4, 
container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:21:23 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:21:23 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:21:23 localhost podman[83695]: 2026-02-20 08:21:23.558729131 +0000 UTC m=+0.099122090 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510) Feb 20 03:21:23 localhost podman[83695]: 2026-02-20 08:21:23.638915717 +0000 UTC m=+0.179308636 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:21:23 localhost systemd[1]: 
5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:21:23 localhost sshd[83747]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:24 localhost systemd[1]: tmp-crun.JPhOOx.mount: Deactivated successfully. Feb 20 03:21:25 localhost sshd[83749]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:21:26 localhost podman[83751]: 2026-02-20 08:21:26.143873745 +0000 UTC m=+0.083070826 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, release=1766032510, io.openshift.expose-services=, 
name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public) Feb 20 03:21:26 localhost podman[83751]: 2026-02-20 08:21:26.508705886 +0000 UTC m=+0.447902927 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:21:26 localhost systemd[1]: 
b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:21:27 localhost sshd[83804]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:32 localhost sshd[83854]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:36 localhost sshd[83856]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:39 localhost sshd[83858]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:40 localhost sshd[83860]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:21:45 localhost podman[83862]: 2026-02-20 08:21:45.14758491 +0000 UTC m=+0.082133326 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:21:45 localhost podman[83862]: 2026-02-20 08:21:45.350959935 +0000 UTC m=+0.285508431 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, vcs-type=git, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, 
io.openshift.expose-services=) Feb 20 03:21:45 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:21:46 localhost sshd[83892]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:21:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:21:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:21:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:21:48 localhost systemd[1]: tmp-crun.CNlYYr.mount: Deactivated successfully. Feb 20 03:21:48 localhost podman[83894]: 2026-02-20 08:21:48.814779086 +0000 UTC m=+0.090401351 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510) Feb 20 03:21:48 localhost systemd[1]: tmp-crun.tlyn5n.mount: Deactivated successfully. 
Feb 20 03:21:48 localhost podman[83895]: 2026-02-20 08:21:48.831475849 +0000 UTC m=+0.099203552 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
com.redhat.component=openstack-cron-container, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:21:48 localhost podman[83895]: 2026-02-20 08:21:48.867066674 +0000 UTC m=+0.134794327 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13) Feb 20 03:21:48 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:21:48 localhost podman[83894]: 2026-02-20 08:21:48.895821808 +0000 UTC m=+0.171444093 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) Feb 20 03:21:48 localhost podman[83899]: 2026-02-20 08:21:48.869817959 +0000 UTC m=+0.134404045 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vcs-type=git) Feb 20 03:21:48 localhost podman[83902]: 2026-02-20 08:21:48.928585156 +0000 UTC m=+0.187155717 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:21:48 localhost podman[83899]: 2026-02-20 08:21:48.9550497 +0000 UTC m=+0.219635776 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 20 03:21:48 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:21:48 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:21:48 localhost podman[83902]: 2026-02-20 08:21:48.989157519 +0000 UTC m=+0.247728100 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true) Feb 20 03:21:49 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:21:49 localhost sshd[83988]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:52 localhost sshd[83991]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:21:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:21:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:21:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:21:54 localhost podman[84037]: 2026-02-20 08:21:54.159310933 +0000 UTC m=+0.094840138 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=) Feb 20 03:21:54 localhost systemd[1]: tmp-crun.oYl5d4.mount: Deactivated successfully. Feb 20 03:21:54 localhost podman[84038]: 2026-02-20 08:21:54.205832163 +0000 UTC m=+0.140807081 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:21:54 localhost podman[84038]: 2026-02-20 08:21:54.239303532 +0000 UTC m=+0.174278420 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z) Feb 20 03:21:54 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:21:54 localhost podman[84037]: 2026-02-20 08:21:54.259555045 +0000 UTC m=+0.195084190 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:21:54 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:21:54 localhost podman[84039]: 2026-02-20 08:21:54.259162353 +0000 UTC m=+0.190252142 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:21:54 localhost podman[84040]: 2026-02-20 08:21:54.315028662 +0000 UTC m=+0.242195730 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Feb 20 03:21:54 localhost podman[84039]: 2026-02-20 08:21:54.337857314 +0000 UTC m=+0.268947043 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Feb 20 03:21:54 localhost 
podman[84040]: 2026-02-20 08:21:54.349813202 +0000 UTC m=+0.276980300 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=) Feb 20 03:21:54 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:21:54 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:21:55 localhost sshd[84131]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:21:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:21:57 localhost systemd[1]: tmp-crun.UQfk1g.mount: Deactivated successfully. 
Feb 20 03:21:57 localhost podman[84133]: 2026-02-20 08:21:57.1477556 +0000 UTC m=+0.086526441 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=nova_migration_target, 
batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:21:57 localhost podman[84133]: 2026-02-20 08:21:57.498060636 +0000 UTC m=+0.436831537 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z) Feb 20 03:21:57 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:22:00 localhost sshd[84155]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:09 localhost sshd[84157]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:22:16 localhost podman[84159]: 2026-02-20 08:22:16.13433978 +0000 UTC m=+0.074807342 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, 
com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:22:16 localhost sshd[84177]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:16 localhost podman[84159]: 2026-02-20 08:22:16.337014543 +0000 UTC m=+0.277482095 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1) Feb 20 03:22:16 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:22:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:22:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:22:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:22:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:22:19 localhost podman[84192]: 2026-02-20 08:22:19.160956932 +0000 UTC m=+0.091664649 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:22:19 localhost podman[84192]: 2026-02-20 08:22:19.20216717 +0000 UTC m=+0.132874917 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1766032510, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 20 03:22:19 localhost podman[84191]: 2026-02-20 08:22:19.212093116 +0000 UTC m=+0.145257039 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1) Feb 20 03:22:19 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:22:19 localhost podman[84191]: 2026-02-20 08:22:19.270163161 +0000 UTC m=+0.203327084 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:22:19 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:22:19 localhost podman[84193]: 2026-02-20 08:22:19.275551867 +0000 UTC m=+0.201219050 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com) Feb 20 03:22:19 localhost podman[84193]: 2026-02-20 08:22:19.360988955 +0000 UTC m=+0.286656078 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, 
build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, config_id=tripleo_step3, tcib_managed=true) Feb 20 03:22:19 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:22:19 localhost podman[84197]: 2026-02-20 08:22:19.327052101 +0000 UTC m=+0.249678590 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13) Feb 20 03:22:19 localhost podman[84197]: 2026-02-20 08:22:19.411142798 +0000 UTC m=+0.333769216 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, architecture=x86_64, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z) Feb 20 03:22:19 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:22:19 localhost sshd[84284]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:20 localhost sshd[84286]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:22 localhost systemd[1]: session-35.scope: Deactivated successfully. 
Feb 20 03:22:22 localhost systemd[1]: session-35.scope: Consumed 19.197s CPU time. Feb 20 03:22:22 localhost systemd-logind[759]: Session 35 logged out. Waiting for processes to exit. Feb 20 03:22:22 localhost systemd-logind[759]: Removed session 35. Feb 20 03:22:24 localhost sshd[84288]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:24 localhost sshd[84289]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:22:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:22:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:22:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:22:25 localhost sshd[84293]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:25 localhost podman[84292]: 2026-02-20 08:22:25.07573754 +0000 UTC m=+0.075802342 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, 
tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, architecture=x86_64, release=1766032510, batch=17.1_20260112.1) Feb 20 03:22:25 localhost podman[84294]: 2026-02-20 08:22:25.142510454 +0000 UTC m=+0.137449688 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510) Feb 20 03:22:25 localhost podman[84297]: 2026-02-20 08:22:25.121781116 +0000 UTC m=+0.118857346 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute) Feb 20 03:22:25 localhost podman[84291]: 2026-02-20 08:22:25.182683079 +0000 UTC m=+0.186412165 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 20 03:22:25 localhost podman[84297]: 2026-02-20 08:22:25.202020424 +0000 UTC m=+0.199096654 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': 
{'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:22:25 localhost podman[84294]: 2026-02-20 08:22:25.210965839 +0000 UTC m=+0.205905003 container exec_died 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent) Feb 20 03:22:25 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:22:25 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:22:25 localhost podman[84291]: 2026-02-20 08:22:25.241119817 +0000 UTC m=+0.244848943 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510) Feb 20 03:22:25 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:22:25 localhost podman[84292]: 2026-02-20 08:22:25.262327289 +0000 UTC m=+0.262392151 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 20 03:22:25 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:22:26 localhost sshd[84381]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:22:28 localhost podman[84383]: 2026-02-20 08:22:28.140884488 +0000 UTC m=+0.080466226 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, tcib_managed=true) Feb 20 03:22:28 localhost podman[84383]: 2026-02-20 08:22:28.497180897 +0000 UTC m=+0.436762625 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:22:28 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:22:29 localhost systemd[1]: tmp-crun.td5LNP.mount: Deactivated successfully. 
Feb 20 03:22:29 localhost podman[84508]: 2026-02-20 08:22:29.962030893 +0000 UTC m=+0.093282290 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_BRANCH=main, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 03:22:30 localhost podman[84508]: 2026-02-20 08:22:30.069015313 +0000 UTC m=+0.200266670 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.buildah.version=1.42.2, release=1770267347, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 03:22:31 localhost sshd[84632]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:35 localhost sshd[84651]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:43 localhost sshd[84653]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:22:47 localhost systemd[1]: tmp-crun.urLTK2.mount: Deactivated successfully. 
Feb 20 03:22:47 localhost podman[84655]: 2026-02-20 08:22:47.046905891 +0000 UTC m=+0.078796815 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, version=17.1.13, vendor=Red Hat, Inc., container_name=metrics_qdr, release=1766032510, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 03:22:47 localhost podman[84655]: 2026-02-20 08:22:47.264950727 +0000 UTC m=+0.296841631 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:14Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.) Feb 20 03:22:47 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:22:48 localhost sshd[84685]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:22:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:22:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:22:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:22:49 localhost systemd[1]: tmp-crun.YfkTbb.mount: Deactivated successfully. Feb 20 03:22:49 localhost systemd[1]: tmp-crun.oYCDCA.mount: Deactivated successfully. Feb 20 03:22:49 localhost podman[84687]: 2026-02-20 08:22:49.578325562 +0000 UTC m=+0.088720609 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 20 03:22:49 localhost podman[84688]: 2026-02-20 08:22:49.627759963 +0000 UTC m=+0.132952839 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 20 03:22:49 localhost podman[84695]: 2026-02-20 08:22:49.644622742 +0000 UTC m=+0.140035538 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, release=1766032510, distribution-scope=public, container_name=ceilometer_agent_compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:22:49 localhost podman[84689]: 2026-02-20 08:22:49.60554637 +0000 UTC m=+0.105122704 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:22:49 localhost podman[84688]: 2026-02-20 08:22:49.660798289 +0000 UTC m=+0.165991185 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 20 03:22:49 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:22:49 localhost podman[84695]: 2026-02-20 08:22:49.673099737 +0000 UTC m=+0.168512523 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:22:49 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:22:49 localhost podman[84689]: 2026-02-20 08:22:49.69107198 +0000 UTC m=+0.190648314 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:22:49 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:22:49 localhost podman[84687]: 2026-02-20 08:22:49.711107127 +0000 UTC m=+0.221502194 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, release=1766032510, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5) Feb 20 03:22:49 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:22:52 localhost sshd[84783]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:22:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:22:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:22:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:22:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:22:56 localhost podman[84830]: 2026-02-20 08:22:56.144925786 +0000 UTC m=+0.081124226 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4) Feb 20 03:22:56 localhost podman[84830]: 2026-02-20 08:22:56.174085933 +0000 UTC m=+0.110284363 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container) Feb 20 03:22:56 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:22:56 localhost systemd[1]: tmp-crun.6qNJm7.mount: Deactivated successfully. Feb 20 03:22:56 localhost podman[84831]: 2026-02-20 08:22:56.256866969 +0000 UTC m=+0.190061426 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, build-date=2026-01-12T22:34:43Z, release=1766032510, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:22:56 localhost podman[84831]: 2026-02-20 08:22:56.267052972 +0000 UTC m=+0.200247429 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
architecture=x86_64) Feb 20 03:22:56 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:22:56 localhost podman[84836]: 2026-02-20 08:22:56.302901705 +0000 UTC m=+0.229556521 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1) Feb 20 03:22:56 localhost podman[84836]: 2026-02-20 08:22:56.349747106 +0000 UTC m=+0.276401942 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., 
release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:22:56 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:22:56 localhost podman[84832]: 2026-02-20 08:22:56.352647895 +0000 UTC m=+0.286845983 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
batch=17.1_20260112.1, distribution-scope=public, release=1766032510) Feb 20 03:22:56 localhost podman[84832]: 2026-02-20 08:22:56.436202364 +0000 UTC m=+0.370400462 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 20 03:22:56 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:22:57 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:22:57 localhost recover_tripleo_nova_virtqemud[84919]: 63005 Feb 20 03:22:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:22:57 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:22:57 localhost systemd[1]: tmp-crun.6CMRZN.mount: Deactivated successfully. Feb 20 03:22:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:22:59 localhost systemd[1]: tmp-crun.S904Ct.mount: Deactivated successfully. 
Feb 20 03:22:59 localhost podman[84920]: 2026-02-20 08:22:59.141873116 +0000 UTC m=+0.081366723 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public) Feb 20 03:22:59 localhost podman[84920]: 2026-02-20 08:22:59.513063653 +0000 UTC m=+0.452557280 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:32:04Z, version=17.1.13) Feb 20 03:22:59 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:23:01 localhost sshd[84943]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:06 localhost sshd[84945]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:09 localhost sshd[84947]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:09 localhost sshd[84949]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:10 localhost sshd[84951]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:10 localhost sshd[84953]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:14 localhost sshd[84955]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:17 localhost sshd[84957]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:23:18 localhost podman[84959]: 2026-02-20 08:23:18.19687902 +0000 UTC m=+0.138761119 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, distribution-scope=public, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:23:18 localhost podman[84959]: 2026-02-20 08:23:18.419994563 +0000 UTC m=+0.361876602 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 
qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:23:18 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:23:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:23:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:23:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:23:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:23:20 localhost sshd[84999]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:20 localhost podman[84988]: 2026-02-20 08:23:20.142157663 +0000 UTC m=+0.081311681 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true) Feb 20 03:23:20 localhost systemd[1]: tmp-crun.xewIWF.mount: Deactivated successfully. 
Feb 20 03:23:20 localhost podman[84989]: 2026-02-20 08:23:20.165572973 +0000 UTC m=+0.095036814 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond) Feb 20 03:23:20 localhost podman[84989]: 2026-02-20 08:23:20.173570799 +0000 UTC m=+0.103034570 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4) Feb 20 03:23:20 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:23:20 localhost podman[84996]: 2026-02-20 08:23:20.222747341 +0000 UTC m=+0.141454471 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z) Feb 20 03:23:20 localhost podman[84996]: 2026-02-20 08:23:20.252075644 +0000 UTC m=+0.170782724 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, version=17.1.13, container_name=ceilometer_agent_compute, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:23:20 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:23:20 localhost podman[84995]: 2026-02-20 08:23:20.275108532 +0000 UTC m=+0.201582031 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, tcib_managed=true, config_id=tripleo_step3) Feb 20 03:23:20 localhost podman[84995]: 2026-02-20 08:23:20.2893521 +0000 UTC m=+0.215825529 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z) Feb 20 03:23:20 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:23:20 localhost podman[84988]: 2026-02-20 08:23:20.326685638 +0000 UTC m=+0.265839636 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:23:20 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:23:22 localhost sshd[85080]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:25 localhost sshd[85082]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:23:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:23:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:23:26 localhost podman[85085]: 2026-02-20 08:23:26.452333361 +0000 UTC m=+0.068544779 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red 
Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:23:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:23:26 localhost podman[85084]: 2026-02-20 08:23:26.501268827 +0000 UTC m=+0.120240930 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510) Feb 20 03:23:26 localhost podman[85084]: 2026-02-20 08:23:26.549144049 +0000 UTC m=+0.168116152 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ovn_controller, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible) Feb 20 03:23:26 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:23:26 localhost podman[85091]: 2026-02-20 08:23:26.481287041 +0000 UTC m=+0.089178623 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:23:26 localhost podman[85128]: 2026-02-20 08:23:26.550357806 +0000 UTC m=+0.064004010 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true) Feb 20 03:23:26 localhost podman[85091]: 2026-02-20 08:23:26.615051766 +0000 UTC m=+0.222943378 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:23:26 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated 
successfully. Feb 20 03:23:26 localhost podman[85128]: 2026-02-20 08:23:26.631254134 +0000 UTC m=+0.144900308 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, release=1766032510, container_name=ovn_metadata_agent, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public) Feb 20 03:23:26 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:23:26 localhost podman[85085]: 2026-02-20 08:23:26.684494832 +0000 UTC m=+0.300706210 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5) Feb 20 03:23:26 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:23:27 localhost sshd[85195]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:23:29 localhost podman[85300]: 2026-02-20 08:23:29.783150371 +0000 UTC m=+0.090131213 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_migration_target, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible) Feb 20 03:23:30 localhost podman[85300]: 2026-02-20 08:23:30.18627346 +0000 UTC m=+0.493254292 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13) Feb 20 03:23:30 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:23:34 localhost systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring. Feb 20 03:23:34 localhost systemd[1]: Created slice User Slice of UID 0. Feb 20 03:23:34 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 20 03:23:34 localhost systemd[1]: Finished User Runtime Directory /run/user/0. 
Feb 20 03:23:34 localhost systemd[1]: Starting User Manager for UID 0... Feb 20 03:23:34 localhost sshd[85666]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:34 localhost systemd[85653]: Queued start job for default target Main User Target. Feb 20 03:23:34 localhost systemd[85653]: Created slice User Application Slice. Feb 20 03:23:34 localhost systemd[85653]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 20 03:23:34 localhost systemd[85653]: Started Daily Cleanup of User's Temporary Directories. Feb 20 03:23:34 localhost systemd[85653]: Reached target Paths. Feb 20 03:23:34 localhost systemd[85653]: Reached target Timers. Feb 20 03:23:34 localhost systemd[85653]: Starting D-Bus User Message Bus Socket... Feb 20 03:23:34 localhost systemd[85653]: Starting Create User's Volatile Files and Directories... Feb 20 03:23:34 localhost systemd[85653]: Finished Create User's Volatile Files and Directories. Feb 20 03:23:34 localhost systemd[85653]: Listening on D-Bus User Message Bus Socket. Feb 20 03:23:34 localhost systemd[85653]: Reached target Sockets. Feb 20 03:23:34 localhost systemd[85653]: Reached target Basic System. Feb 20 03:23:34 localhost systemd[85653]: Reached target Main User Target. Feb 20 03:23:34 localhost systemd[85653]: Startup finished in 150ms. Feb 20 03:23:34 localhost systemd[1]: Started User Manager for UID 0. Feb 20 03:23:34 localhost systemd[1]: Started Session c11 of User root. Feb 20 03:23:35 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Feb 20 03:23:35 localhost kernel: device tape7aa8e2a-27 entered promiscuous mode Feb 20 03:23:35 localhost NetworkManager[5988]: [1771575815.5975] manager: (tape7aa8e2a-27): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Feb 20 03:23:35 localhost systemd-udevd[85690]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 03:23:35 localhost NetworkManager[5988]: [1771575815.6116] device (tape7aa8e2a-27): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 20 03:23:35 localhost NetworkManager[5988]: [1771575815.6129] device (tape7aa8e2a-27): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Feb 20 03:23:35 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 20 03:23:35 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... Feb 20 03:23:35 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Feb 20 03:23:35 localhost systemd-machined[85698]: New machine qemu-1-instance-00000002. Feb 20 03:23:35 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000002. Feb 20 03:23:35 localhost systemd-udevd[85689]: Network interface NamePolicy= disabled on kernel command line. Feb 20 03:23:35 localhost NetworkManager[5988]: [1771575815.8353] manager: (tapde929a91-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Feb 20 03:23:35 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapde929a91-c1: link becomes ready Feb 20 03:23:35 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapde929a91-c0: link becomes ready Feb 20 03:23:35 localhost NetworkManager[5988]: [1771575815.8799] device (tapde929a91-c0): carrier: link connected Feb 20 03:23:36 localhost kernel: device tapde929a91-c0 entered promiscuous mode Feb 20 03:23:37 localhost sshd[85790]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:37 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Feb 20 03:23:37 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. 
Feb 20 03:23:37 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. Feb 20 03:23:37 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. Feb 20 03:23:38 localhost podman[85835]: 2026-02-20 08:23:38.231703121 +0000 UTC m=+0.094232579 container create 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com) Feb 20 03:23:38 localhost podman[85835]: 2026-02-20 08:23:38.185350595 +0000 UTC m=+0.047880073 image pull 
registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 20 03:23:38 localhost systemd[1]: Started libpod-conmon-57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba.scope. Feb 20 03:23:38 localhost systemd[1]: tmp-crun.XD2Jlr.mount: Deactivated successfully. Feb 20 03:23:38 localhost systemd[1]: Started libcrun container. Feb 20 03:23:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ea5b54d1da71d972d7e8dd243987640d185da35de896817d599cfae85808380/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 03:23:38 localhost podman[85835]: 2026-02-20 08:23:38.372472681 +0000 UTC m=+0.235002109 container init 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 20 03:23:38 localhost podman[85835]: 2026-02-20 08:23:38.378238998 +0000 UTC m=+0.240768426 container start 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 20 03:23:38 localhost setroubleshoot[85792]: SELinux is preventing /usr/libexec/qemu-kvm from read access 
on the file max_map_count. For complete SELinux messages run: sealert -l 4c153363-0b75-4da9-9673-ecc521f0261c Feb 20 03:23:38 localhost setroubleshoot[85792]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.#012#012***** Plugin qemu_file_image (98.8 confidence) suggests *******************#012#012If max_map_count is a virtualization target#012Then you need to change the label on max_map_count'#012Do#012# semanage fcontext -a -t virt_image_t 'max_map_count'#012# restorecon -v 'max_map_count'#012#012***** Plugin catchall (2.13 confidence) suggests **************************#012#012If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm#012# semodule -X 300 -i my-qemukvm.pp#012 Feb 20 03:23:41 localhost sshd[85859]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:45 localhost sshd[85862]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:47 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully. Feb 20 03:23:48 localhost snmpd[68593]: empty variable list in _query Feb 20 03:23:48 localhost snmpd[68593]: empty variable list in _query Feb 20 03:23:48 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Feb 20 03:23:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:23:48 localhost podman[85864]: 2026-02-20 08:23:48.736396575 +0000 UTC m=+0.070241872 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1) Feb 20 03:23:48 localhost sshd[85877]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:48 localhost podman[85864]: 2026-02-20 08:23:48.922507909 +0000 UTC m=+0.256353206 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:23:48 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:23:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:23:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:23:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:23:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:23:50 localhost podman[85895]: 2026-02-20 08:23:50.933535124 +0000 UTC m=+0.090393581 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:23:50 localhost podman[85895]: 2026-02-20 08:23:50.969991146 +0000 UTC m=+0.126849573 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:23:50 localhost systemd[1]: tmp-crun.t1t0GO.mount: Deactivated successfully. 
Feb 20 03:23:50 localhost podman[85897]: 2026-02-20 08:23:50.983798 +0000 UTC m=+0.134807797 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git) Feb 20 03:23:50 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:23:51 localhost podman[85897]: 2026-02-20 08:23:51.023026917 +0000 UTC m=+0.174036694 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:23:51 localhost podman[85896]: 2026-02-20 08:23:51.034714276 +0000 UTC m=+0.186441405 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=logrotate_crond, managed_by=tripleo_ansible, 
name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1766032510, url=https://www.redhat.com) Feb 20 03:23:51 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:23:51 localhost podman[85896]: 2026-02-20 08:23:51.04590734 +0000 UTC m=+0.197634489 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:23:51 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:23:51 localhost podman[85898]: 2026-02-20 08:23:51.136037423 +0000 UTC m=+0.283714498 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510) Feb 20 03:23:51 localhost podman[85898]: 2026-02-20 08:23:51.190134007 +0000 UTC m=+0.337811082 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
container_name=ceilometer_agent_compute, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 20 03:23:51 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:23:51 localhost sshd[85988]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:53 localhost sshd[85990]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:53 localhost sshd[85991]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:23:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:23:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:23:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:23:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:23:57 localhost podman[86039]: 2026-02-20 08:23:57.07096871 +0000 UTC m=+0.083422267 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, container_name=ovn_controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:23:57 localhost systemd[1]: tmp-crun.wamHle.mount: Deactivated successfully. Feb 20 03:23:57 localhost podman[86040]: 2026-02-20 08:23:57.140031975 +0000 UTC m=+0.149646484 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, release=1766032510, config_id=tripleo_step3) Feb 20 03:23:57 localhost podman[86041]: 2026-02-20 08:23:57.175304139 +0000 UTC m=+0.181965158 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, 
Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:23:57 localhost podman[86039]: 2026-02-20 08:23:57.194478189 +0000 UTC m=+0.206931716 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:23:57 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:23:57 localhost podman[86040]: 2026-02-20 08:23:57.203093564 +0000 UTC m=+0.212708063 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5) Feb 20 03:23:57 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:23:57 localhost podman[86041]: 2026-02-20 08:23:57.221005465 +0000 UTC m=+0.227666534 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, tcib_managed=true) Feb 20 03:23:57 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:23:57 localhost podman[86042]: 2026-02-20 08:23:57.281230437 +0000 UTC m=+0.285509762 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 03:23:57 localhost podman[86042]: 2026-02-20 08:23:57.310013903 +0000 UTC m=+0.314293288 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 03:23:57 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 03:23:57 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35900 [20/Feb/2026:08:23:56.500] listener listener/metadata 0/0/0/1164/1164 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 20 03:23:57 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35912 [20/Feb/2026:08:23:57.763] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Feb 20 03:23:57 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35924 [20/Feb/2026:08:23:57.817] listener listener/metadata 0/0/0/12/12 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 20 03:23:57 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35938 [20/Feb/2026:08:23:57.867] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Feb 20 03:23:57 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35950 [20/Feb/2026:08:23:57.917] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Feb 20 03:23:57 localhost sshd[86129]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:23:57 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35954 [20/Feb/2026:08:23:57.967] listener listener/metadata 0/0/0/13/13 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35958 [20/Feb/2026:08:23:58.018] listener listener/metadata 0/0/0/14/14 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35960 [20/Feb/2026:08:23:58.081] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35966 [20/Feb/2026:08:23:58.168] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35968 [20/Feb/2026:08:23:58.234] listener listener/metadata 0/0/0/12/12 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35970 [20/Feb/2026:08:23:58.292] listener listener/metadata 0/0/0/11/11 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35972 [20/Feb/2026:08:23:58.331] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35978 [20/Feb/2026:08:23:58.378] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35982 [20/Feb/2026:08:23:58.419] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35998 [20/Feb/2026:08:23:58.470] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Feb 20 03:23:58 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:36006 [20/Feb/2026:08:23:58.521] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Feb 20 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 03:24:00 localhost systemd[1]: tmp-crun.2p1RDW.mount: Deactivated successfully.
Feb 20 03:24:00 localhost podman[86131]: 2026-02-20 08:24:00.619902209 +0000 UTC m=+0.092499106 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 03:24:01 localhost podman[86131]: 2026-02-20 08:24:01.03578668 +0000 UTC m=+0.508383607 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 03:24:01 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 03:24:03 localhost sshd[86154]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:24:10 localhost sshd[86156]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:24:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 03:24:19 localhost podman[86158]: 2026-02-20 08:24:19.115779004 +0000 UTC m=+0.059825260 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_id=tripleo_step1, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, architecture=x86_64, release=1766032510, tcib_managed=true, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 20 03:24:19 localhost podman[86158]: 2026-02-20 08:24:19.305846261 +0000 UTC m=+0.249892447 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 03:24:19 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 03:24:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 03:24:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 03:24:21 localhost podman[86189]: 2026-02-20 08:24:21.145088801 +0000 UTC m=+0.081220309 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 03:24:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 03:24:21 localhost podman[86189]: 2026-02-20 08:24:21.156662358 +0000 UTC m=+0.092793826 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, container_name=collectd, batch=17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5)
Feb 20 03:24:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 03:24:21 localhost systemd[1]: tmp-crun.zkodIA.mount: Deactivated successfully.
Feb 20 03:24:21 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 03:24:21 localhost podman[86188]: 2026-02-20 08:24:21.236902626 +0000 UTC m=+0.174502508 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, version=17.1.13, vcs-type=git, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 03:24:21 localhost podman[86188]: 2026-02-20 08:24:21.263244736 +0000 UTC m=+0.200844618 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 20 03:24:21 localhost podman[86217]: 2026-02-20 08:24:21.274681898 +0000 UTC m=+0.107353624 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, version=17.1.13, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, distribution-scope=public)
Feb 20 03:24:21 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 03:24:21 localhost podman[86217]: 2026-02-20 08:24:21.31213005 +0000 UTC m=+0.144801796 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:24:21 localhost podman[86232]: 2026-02-20 08:24:21.323733296 +0000 UTC m=+0.088742320 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true) Feb 20 03:24:21 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:24:21 localhost podman[86232]: 2026-02-20 08:24:21.378359357 +0000 UTC m=+0.143368401 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1) Feb 20 03:24:21 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:24:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:24:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:24:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:24:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:24:28 localhost systemd[1]: tmp-crun.CkalX4.mount: Deactivated successfully. 
Feb 20 03:24:28 localhost podman[86284]: 2026-02-20 08:24:28.16499866 +0000 UTC m=+0.098633364 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:24:28 localhost podman[86283]: 2026-02-20 08:24:28.135044289 +0000 UTC m=+0.076203995 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:24:28 localhost podman[86282]: 2026-02-20 08:24:28.223979184 +0000 UTC m=+0.162504059 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, release=1766032510, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:24:28 localhost podman[86284]: 2026-02-20 08:24:28.250026486 +0000 UTC m=+0.183661210 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, release=1766032510, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 20 03:24:28 localhost podman[86281]: 2026-02-20 08:24:28.19910819 +0000 UTC m=+0.141346619 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1) Feb 20 03:24:28 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:24:28 localhost podman[86283]: 2026-02-20 08:24:28.272198418 +0000 UTC m=+0.213358124 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z) Feb 20 03:24:28 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:24:28 localhost podman[86282]: 2026-02-20 08:24:28.305556084 +0000 UTC m=+0.244080939 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, release=1766032510, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:24:28 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:24:28 localhost podman[86281]: 2026-02-20 08:24:28.330205652 +0000 UTC m=+0.272444091 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, 
managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:24:28 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:24:29 localhost systemd[1]: tmp-crun.XASm87.mount: Deactivated successfully. Feb 20 03:24:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:24:32 localhost systemd[1]: tmp-crun.zuzjNJ.mount: Deactivated successfully. 
Feb 20 03:24:32 localhost podman[86372]: 2026-02-20 08:24:32.154672645 +0000 UTC m=+0.092154215 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, 
name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4) Feb 20 03:24:32 localhost podman[86372]: 2026-02-20 08:24:32.563238941 +0000 UTC m=+0.500720511 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:24:32 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:24:35 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:24:35 localhost recover_tripleo_nova_virtqemud[86475]: 63005 Feb 20 03:24:35 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:24:35 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:24:39 localhost sshd[86476]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:24:48 localhost sshd[86479]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:24:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:24:49 localhost podman[86481]: 2026-02-20 08:24:49.909779617 +0000 UTC m=+0.067990652 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:24:50 localhost podman[86481]: 2026-02-20 08:24:50.085732809 +0000 UTC m=+0.243943834 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:24:50 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:24:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:24:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:24:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:24:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:24:52 localhost systemd[1]: tmp-crun.VNwrCD.mount: Deactivated successfully. Feb 20 03:24:52 localhost systemd[1]: tmp-crun.WLQZrS.mount: Deactivated successfully. Feb 20 03:24:52 localhost podman[86511]: 2026-02-20 08:24:52.139582351 +0000 UTC m=+0.082424576 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com) Feb 20 03:24:52 localhost podman[86520]: 2026-02-20 08:24:52.199704391 +0000 UTC m=+0.129338189 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com) Feb 20 03:24:52 localhost podman[86511]: 2026-02-20 08:24:52.226089462 +0000 UTC m=+0.168931707 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:24:52 localhost podman[86513]: 2026-02-20 08:24:52.175132435 +0000 UTC m=+0.106672512 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:24:52 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:24:52 localhost podman[86520]: 2026-02-20 08:24:52.245878191 +0000 UTC m=+0.175512009 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible) Feb 20 03:24:52 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:24:52 localhost podman[86512]: 2026-02-20 08:24:52.302278966 +0000 UTC m=+0.238191508 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510) Feb 20 03:24:52 localhost podman[86512]: 2026-02-20 08:24:52.313997937 +0000 UTC m=+0.249910499 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-cron-container) Feb 20 03:24:52 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:24:52 localhost podman[86513]: 2026-02-20 08:24:52.362128897 +0000 UTC m=+0.293668964 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=collectd, vcs-type=git) Feb 20 03:24:52 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:24:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:24:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:24:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:24:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:24:59 localhost systemd[1]: tmp-crun.iZyW3n.mount: Deactivated successfully. 
Feb 20 03:24:59 localhost podman[86648]: 2026-02-20 08:24:59.165145404 +0000 UTC m=+0.098662836 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:24:59 localhost podman[86649]: 2026-02-20 08:24:59.231390972 +0000 UTC m=+0.162357795 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:24:59 localhost podman[86647]: 2026-02-20 08:24:59.140780835 +0000 UTC m=+0.080042273 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, 
name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 
20 03:24:59 localhost podman[86648]: 2026-02-20 08:24:59.244461574 +0000 UTC m=+0.177979056 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public) Feb 20 03:24:59 localhost podman[86655]: 2026-02-20 08:24:59.193282059 +0000 UTC m=+0.123795798 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5) Feb 20 03:24:59 localhost 
systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:24:59 localhost podman[86649]: 2026-02-20 08:24:59.274045303 +0000 UTC m=+0.205012086 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Feb 20 03:24:59 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:24:59 localhost podman[86647]: 2026-02-20 08:24:59.321876794 +0000 UTC m=+0.261138262 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z) Feb 20 03:24:59 localhost podman[86655]: 2026-02-20 08:24:59.329149139 +0000 UTC m=+0.259662888 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': 
{'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:24:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:24:59 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:25:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:25:03 localhost podman[86736]: 2026-02-20 08:25:03.145351957 +0000 UTC m=+0.083750817 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public) Feb 20 03:25:03 localhost podman[86736]: 2026-02-20 08:25:03.526102369 +0000 UTC m=+0.464501229 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:25:03 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:25:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:25:21 localhost podman[86760]: 2026-02-20 08:25:21.148212389 +0000 UTC m=+0.087755680 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64) Feb 20 03:25:21 localhost podman[86760]: 2026-02-20 08:25:21.31986725 +0000 UTC m=+0.259410551 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, container_name=metrics_qdr, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Feb 20 03:25:21 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:25:22 localhost sshd[86789]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:25:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:25:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:25:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:25:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:25:23 localhost podman[86791]: 2026-02-20 08:25:23.166068755 +0000 UTC m=+0.095076485 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:25:23 localhost podman[86794]: 2026-02-20 08:25:23.220017885 +0000 UTC m=+0.140326448 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_id=tripleo_step4, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:25:23 localhost podman[86791]: 2026-02-20 08:25:23.246975634 +0000 UTC m=+0.175983324 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T23:07:30Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, 
com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:25:23 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:25:23 localhost podman[86794]: 2026-02-20 08:25:23.275275674 +0000 UTC m=+0.195584257 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public) Feb 20 03:25:23 localhost podman[86793]: 2026-02-20 08:25:23.185970608 +0000 UTC m=+0.108052765 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd) Feb 20 03:25:23 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:25:23 localhost podman[86792]: 2026-02-20 08:25:23.319930848 +0000 UTC m=+0.245534124 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=logrotate_crond, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, release=1766032510, managed_by=tripleo_ansible) Feb 20 03:25:23 localhost podman[86792]: 2026-02-20 08:25:23.355248524 +0000 UTC m=+0.280851850 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public) Feb 20 03:25:23 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:25:23 localhost podman[86793]: 2026-02-20 08:25:23.372097542 +0000 UTC m=+0.294179669 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible) Feb 20 03:25:23 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:25:24 localhost systemd[1]: tmp-crun.4aadbn.mount: Deactivated successfully. Feb 20 03:25:25 localhost sshd[86885]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:25:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:25:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:25:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:25:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:25:30 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Feb 20 03:25:30 localhost recover_tripleo_nova_virtqemud[86913]: 63005 Feb 20 03:25:30 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:25:30 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:25:30 localhost systemd[1]: tmp-crun.f8rzCM.mount: Deactivated successfully. Feb 20 03:25:30 localhost podman[86889]: 2026-02-20 08:25:30.134562153 +0000 UTC m=+0.072491511 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1) Feb 20 03:25:30 localhost systemd[1]: tmp-crun.r3UlS6.mount: Deactivated successfully. 
Feb 20 03:25:30 localhost podman[86890]: 2026-02-20 08:25:30.156731745 +0000 UTC m=+0.087872244 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, container_name=nova_compute, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, version=17.1.13, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step5, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container) Feb 20 03:25:30 localhost podman[86889]: 2026-02-20 08:25:30.183018034 +0000 UTC m=+0.120947452 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:25:30 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:25:30 localhost podman[86890]: 2026-02-20 08:25:30.213211532 +0000 UTC m=+0.144351981 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:25:30 localhost systemd[1]: 
a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:25:30 localhost podman[86888]: 2026-02-20 08:25:30.300374962 +0000 UTC m=+0.237202126 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, release=1766032510, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:25:30 localhost podman[86888]: 2026-02-20 08:25:30.307887604 +0000 UTC m=+0.244714778 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, container_name=iscsid, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public) Feb 20 03:25:30 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:25:30 localhost podman[86887]: 2026-02-20 08:25:30.397471779 +0000 UTC m=+0.337666816 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, build-date=2026-01-12T22:36:40Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:25:30 localhost podman[86887]: 2026-02-20 08:25:30.446111926 +0000 UTC m=+0.386307003 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com) Feb 20 03:25:30 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:25:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:25:34 localhost podman[86980]: 2026-02-20 08:25:34.154159118 +0000 UTC m=+0.084382717 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git) Feb 20 03:25:34 localhost podman[86980]: 2026-02-20 08:25:34.525046806 +0000 UTC m=+0.455270385 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 
nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_migration_target, version=17.1.13, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:25:34 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:25:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:25:52 localhost podman[87083]: 2026-02-20 08:25:52.137849111 +0000 UTC m=+0.077397192 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, distribution-scope=public, version=17.1.13) Feb 20 03:25:52 localhost podman[87083]: 2026-02-20 08:25:52.296667765 +0000 UTC m=+0.236215816 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public) Feb 20 03:25:52 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:25:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:25:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:25:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:25:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:25:54 localhost systemd[1]: tmp-crun.P4AdEu.mount: Deactivated successfully. Feb 20 03:25:54 localhost podman[87113]: 2026-02-20 08:25:54.149690621 +0000 UTC m=+0.091336111 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public) Feb 20 03:25:54 localhost podman[87113]: 2026-02-20 08:25:54.197558483 +0000 UTC m=+0.139203963 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
release=1766032510, tcib_managed=true) Feb 20 03:25:54 localhost podman[87121]: 2026-02-20 08:25:54.196708237 +0000 UTC m=+0.129277127 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5) Feb 20 03:25:54 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:25:54 localhost podman[87114]: 2026-02-20 08:25:54.243069453 +0000 UTC m=+0.181227985 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Feb 20 03:25:54 localhost podman[87114]: 2026-02-20 08:25:54.257932501 +0000 UTC m=+0.196091043 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, 
name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git) Feb 20 03:25:54 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:25:54 localhost podman[87115]: 2026-02-20 08:25:54.3021543 +0000 UTC m=+0.236681600 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, release=1766032510) Feb 20 03:25:54 localhost podman[87115]: 2026-02-20 08:25:54.309982081 +0000 UTC m=+0.244509421 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.openshift.expose-services=) Feb 20 03:25:54 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:25:54 localhost podman[87121]: 2026-02-20 08:25:54.327006885 +0000 UTC m=+0.259575775 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:25:54 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:26:00 localhost sshd[87252]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:26:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:26:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:26:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:26:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:26:01 localhost podman[87257]: 2026-02-20 08:26:01.164666538 +0000 UTC m=+0.091743413 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:26:01 localhost systemd[1]: tmp-crun.gPcAlS.mount: Deactivated successfully. 
Feb 20 03:26:01 localhost podman[87254]: 2026-02-20 08:26:01.216237634 +0000 UTC m=+0.149894571 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:26:01 localhost podman[87257]: 2026-02-20 08:26:01.222214218 +0000 UTC m=+0.149291073 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13) Feb 20 03:26:01 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:26:01 localhost podman[87256]: 2026-02-20 08:26:01.262240069 +0000 UTC m=+0.193018738 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git) Feb 20 03:26:01 localhost podman[87256]: 2026-02-20 08:26:01.303757246 +0000 UTC m=+0.234535905 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:26:01 localhost podman[87255]: 2026-02-20 08:26:01.319753578 +0000 UTC m=+0.248598427 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.5, summary=Red Hat 
OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team) Feb 20 03:26:01 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:26:01 localhost podman[87255]: 2026-02-20 08:26:01.329840228 +0000 UTC m=+0.258685087 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid) Feb 20 03:26:01 localhost podman[87254]: 2026-02-20 08:26:01.344228851 +0000 UTC m=+0.277885788 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13) Feb 20 03:26:01 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:26:01 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:26:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:26:05 localhost podman[87344]: 2026-02-20 08:26:05.142710264 +0000 UTC m=+0.077326509 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 20 03:26:05 localhost podman[87344]: 2026-02-20 08:26:05.5190699 +0000 UTC m=+0.453686225 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5) Feb 20 03:26:05 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:26:11 localhost sshd[87367]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:26:11 localhost sshd[87369]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:26:19 localhost sshd[87371]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:26:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:26:23 localhost systemd[1]: tmp-crun.dPYgr3.mount: Deactivated successfully. Feb 20 03:26:23 localhost podman[87373]: 2026-02-20 08:26:23.142411739 +0000 UTC m=+0.087671077 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:10:14Z, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 
17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Feb 20 03:26:23 localhost podman[87373]: 2026-02-20 08:26:23.356842625 +0000 UTC m=+0.302101953 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:26:23 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:26:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:26:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:26:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:26:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:26:25 localhost systemd[1]: tmp-crun.TGOfTC.mount: Deactivated successfully. Feb 20 03:26:25 localhost podman[87403]: 2026-02-20 08:26:25.183120058 +0000 UTC m=+0.104574828 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-cron, 
architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 20 03:26:25 localhost podman[87410]: 2026-02-20 08:26:25.203558706 +0000 UTC m=+0.121410295 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack 
TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 20 03:26:25 localhost podman[87404]: 2026-02-20 08:26:25.159352937 +0000 UTC m=+0.081597751 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git) Feb 20 03:26:25 localhost podman[87410]: 2026-02-20 08:26:25.236052786 +0000 UTC m=+0.153904445 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute) Feb 20 03:26:25 localhost podman[87404]: 2026-02-20 08:26:25.244044461 +0000 UTC m=+0.166289225 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z) Feb 20 03:26:25 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:26:25 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:26:25 localhost podman[87402]: 2026-02-20 08:26:25.304104619 +0000 UTC m=+0.232889544 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.13, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:26:25 localhost podman[87403]: 2026-02-20 08:26:25.315978894 +0000 UTC m=+0.237433664 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, vendor=Red Hat, Inc.) Feb 20 03:26:25 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:26:25 localhost podman[87402]: 2026-02-20 08:26:25.354998004 +0000 UTC m=+0.283782929 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:26:25 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:26:26 localhost systemd[1]: tmp-crun.3amH1q.mount: Deactivated successfully. Feb 20 03:26:28 localhost sshd[87492]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:26:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:26:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:26:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:26:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:26:32 localhost systemd[1]: tmp-crun.nlNCLm.mount: Deactivated successfully. 
Feb 20 03:26:32 localhost podman[87496]: 2026-02-20 08:26:32.160770286 +0000 UTC m=+0.087839512 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 20 03:26:32 localhost podman[87499]: 2026-02-20 08:26:32.177952926 +0000 UTC m=+0.096993135 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64) Feb 20 03:26:32 localhost podman[87495]: 2026-02-20 08:26:32.20570989 +0000 UTC m=+0.135197181 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, container_name=iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13) Feb 20 03:26:32 localhost podman[87495]: 2026-02-20 08:26:32.216982686 +0000 UTC m=+0.146469957 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 
17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1) Feb 20 03:26:32 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:26:32 localhost podman[87496]: 2026-02-20 08:26:32.237417395 +0000 UTC m=+0.164486631 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64) Feb 20 03:26:32 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:26:32 localhost podman[87499]: 2026-02-20 08:26:32.261163124 +0000 UTC m=+0.180203383 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_compute) Feb 20 03:26:32 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:26:32 localhost podman[87494]: 2026-02-20 08:26:32.303986892 +0000 UTC m=+0.236780014 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vendor=Red Hat, Inc., 
url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:26:32 localhost podman[87494]: 2026-02-20 08:26:32.35202439 +0000 UTC m=+0.284817492 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, config_id=tripleo_step4, architecture=x86_64, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1) Feb 20 03:26:32 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:26:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:26:36 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:26:36 localhost recover_tripleo_nova_virtqemud[87585]: 63005 Feb 20 03:26:36 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:26:36 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:26:36 localhost systemd[1]: tmp-crun.tD16SX.mount: Deactivated successfully. 
Feb 20 03:26:36 localhost podman[87583]: 2026-02-20 08:26:36.15039633 +0000 UTC m=+0.088853684 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, batch=17.1_20260112.1, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510) Feb 20 03:26:36 localhost podman[87583]: 2026-02-20 08:26:36.520782813 +0000 UTC m=+0.459240147 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 20 03:26:36 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:26:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:26:54 localhost podman[87685]: 2026-02-20 08:26:54.143686378 +0000 UTC m=+0.082230491 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 03:26:54 localhost podman[87685]: 2026-02-20 08:26:54.321886599 +0000 UTC m=+0.260430682 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:26:54 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:26:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:26:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:26:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:26:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:26:56 localhost podman[87717]: 2026-02-20 08:26:56.152479655 +0000 UTC m=+0.079209388 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc.) 
Feb 20 03:26:56 localhost podman[87717]: 2026-02-20 08:26:56.192043941 +0000 UTC m=+0.118773634 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:26:56 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:26:56 localhost systemd[1]: tmp-crun.Oz76mY.mount: Deactivated successfully. 
Feb 20 03:26:56 localhost podman[87716]: 2026-02-20 08:26:56.326988892 +0000 UTC m=+0.254911322 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible) Feb 20 03:26:56 localhost podman[87716]: 2026-02-20 08:26:56.334142512 +0000 UTC m=+0.262064932 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:26:56 localhost podman[87715]: 2026-02-20 08:26:56.29183719 +0000 UTC m=+0.223424982 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:26:56 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:26:56 localhost podman[87715]: 2026-02-20 08:26:56.375212685 +0000 UTC m=+0.306800427 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T23:07:30Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git) Feb 20 03:26:56 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:26:56 localhost podman[87719]: 2026-02-20 08:26:56.376888057 +0000 UTC m=+0.299210574 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, architecture=x86_64, release=1766032510) Feb 20 03:26:56 localhost podman[87719]: 2026-02-20 08:26:56.460382665 +0000 UTC m=+0.382705182 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, 
vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, maintainer=OpenStack TripleO 
Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:26:56 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:26:57 localhost systemd[1]: tmp-crun.EJJfhA.mount: Deactivated successfully. Feb 20 03:26:57 localhost sshd[87828]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:27:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:27:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:27:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:27:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:27:03 localhost podman[87854]: 2026-02-20 08:27:03.145817876 +0000 UTC m=+0.081378804 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, container_name=nova_compute, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Feb 20 03:27:03 localhost podman[87854]: 2026-02-20 08:27:03.181509344 +0000 UTC m=+0.117070332 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-type=git, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:27:03 localhost systemd[1]: tmp-crun.UZ2vVc.mount: Deactivated successfully. Feb 20 03:27:03 localhost podman[87853]: 2026-02-20 08:27:03.197830706 +0000 UTC m=+0.136622763 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z) Feb 20 03:27:03 localhost systemd[1]: 
a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:27:03 localhost podman[87851]: 2026-02-20 08:27:03.250146405 +0000 UTC m=+0.190124959 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:27:03 localhost podman[87852]: 2026-02-20 08:27:03.302304659 +0000 UTC m=+0.241511489 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, container_name=iscsid, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:27:03 localhost podman[87853]: 2026-02-20 08:27:03.325709169 +0000 UTC m=+0.264501276 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, container_name=ovn_metadata_agent, architecture=x86_64) Feb 20 03:27:03 localhost podman[87851]: 2026-02-20 08:27:03.326105891 +0000 UTC m=+0.266084425 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, architecture=x86_64) Feb 20 03:27:03 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:27:03 localhost podman[87852]: 2026-02-20 08:27:03.341042891 +0000 UTC m=+0.280249721 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=iscsid, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:27:03 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:27:03 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:27:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:27:07 localhost podman[87950]: 2026-02-20 08:27:07.143103195 +0000 UTC m=+0.081856358 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:27:07 localhost podman[87950]: 2026-02-20 08:27:07.511533167 +0000 UTC m=+0.450286340 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:27:07 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:27:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:27:25 localhost systemd[1]: tmp-crun.OOV9RU.mount: Deactivated successfully. 
Feb 20 03:27:25 localhost podman[87973]: 2026-02-20 08:27:25.163115565 +0000 UTC m=+0.096568231 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, release=1766032510, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:27:25 localhost podman[87973]: 2026-02-20 08:27:25.391617063 +0000 UTC m=+0.325069689 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, build-date=2026-01-12T22:10:14Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:27:25 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:27:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:27:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:27:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:27:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:27:27 localhost systemd[1]: tmp-crun.y0M8Bn.mount: Deactivated successfully. Feb 20 03:27:27 localhost podman[88003]: 2026-02-20 08:27:27.150795932 +0000 UTC m=+0.088394250 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 20 03:27:27 localhost podman[88005]: 2026-02-20 08:27:27.190858364 +0000 UTC m=+0.121973292 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=) Feb 20 03:27:27 localhost podman[88003]: 2026-02-20 08:27:27.198922012 +0000 UTC m=+0.136520360 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:27:27 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:27:27 localhost podman[88005]: 2026-02-20 08:27:27.252559662 +0000 UTC m=+0.183674600 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 20 03:27:27 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:27:27 localhost podman[88009]: 2026-02-20 08:27:27.304885032 +0000 UTC m=+0.232797082 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:07:47Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:27:27 localhost podman[88004]: 2026-02-20 08:27:27.259381462 +0000 UTC m=+0.189990455 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., architecture=x86_64) Feb 20 03:27:27 localhost podman[88004]: 2026-02-20 08:27:27.347984187 +0000 UTC m=+0.278593120 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.13 
17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510) Feb 20 03:27:27 localhost 
systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:27:27 localhost podman[88009]: 2026-02-20 08:27:27.402794393 +0000 UTC m=+0.330706483 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:27:27 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:27:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:27:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:27:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:27:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:27:34 localhost podman[88097]: 2026-02-20 08:27:34.143291817 +0000 UTC m=+0.079718982 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:27:34 localhost podman[88097]: 2026-02-20 08:27:34.165987955 +0000 UTC m=+0.102415100 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 20 03:27:34 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:27:34 localhost systemd[1]: tmp-crun.tTiZnP.mount: Deactivated successfully. Feb 20 03:27:34 localhost podman[88098]: 2026-02-20 08:27:34.256883042 +0000 UTC m=+0.190474460 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, container_name=iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:27:34 localhost podman[88098]: 2026-02-20 08:27:34.289340839 +0000 UTC m=+0.222932197 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, description=Red Hat 
OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, 
release=1766032510, architecture=x86_64) Feb 20 03:27:34 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:27:34 localhost podman[88100]: 2026-02-20 08:27:34.30560636 +0000 UTC m=+0.234637568 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:27:34 localhost podman[88099]: 2026-02-20 08:27:34.356734402 +0000 UTC m=+0.288593437 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git) Feb 20 03:27:34 localhost podman[88099]: 2026-02-20 08:27:34.391920715 +0000 UTC m=+0.323779660 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:27:34 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:27:34 localhost podman[88100]: 2026-02-20 08:27:34.40966567 +0000 UTC m=+0.338696888 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 20 03:27:34 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:27:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:27:38 localhost podman[88189]: 2026-02-20 08:27:38.148730617 +0000 UTC m=+0.088337239 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:27:38 localhost podman[88189]: 2026-02-20 08:27:38.508206263 +0000 UTC m=+0.447812855 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container) Feb 20 03:27:38 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:27:43 localhost sshd[88290]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:27:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:27:56 localhost podman[88292]: 2026-02-20 08:27:56.15584944 +0000 UTC m=+0.091027732 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:27:56 localhost podman[88292]: 2026-02-20 08:27:56.378060274 +0000 UTC m=+0.313238496 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 20 03:27:56 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:27:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:27:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:27:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:27:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:27:58 localhost podman[88367]: 2026-02-20 08:27:58.159381654 +0000 UTC m=+0.090211816 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:27:58 localhost podman[88370]: 2026-02-20 08:27:58.207016889 +0000 UTC m=+0.129032020 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, 
release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 20 03:27:58 localhost podman[88367]: 2026-02-20 08:27:58.211962681 +0000 UTC m=+0.142792773 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, 
config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git) Feb 20 03:27:58 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:27:58 localhost systemd[1]: tmp-crun.V20X3Z.mount: Deactivated successfully. Feb 20 03:27:58 localhost podman[88368]: 2026-02-20 08:27:58.257937645 +0000 UTC m=+0.187483928 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, release=1766032510, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:27:58 localhost podman[88370]: 2026-02-20 08:27:58.267936153 +0000 UTC m=+0.189951244 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:27:58 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: 
Deactivated successfully. Feb 20 03:27:58 localhost podman[88368]: 2026-02-20 08:27:58.295125209 +0000 UTC m=+0.224671482 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, distribution-scope=public, container_name=logrotate_crond, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 20 03:27:58 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:27:58 localhost podman[88369]: 2026-02-20 08:27:58.312324528 +0000 UTC m=+0.236846356 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd) Feb 20 03:27:58 localhost podman[88369]: 2026-02-20 08:27:58.323853983 +0000 UTC m=+0.248375811 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5) Feb 20 03:27:58 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:28:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:28:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:28:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:28:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:28:05 localhost systemd[1]: tmp-crun.ACF79h.mount: Deactivated successfully. 
Feb 20 03:28:05 localhost podman[88459]: 2026-02-20 08:28:05.146838934 +0000 UTC m=+0.086140540 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, container_name=iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 20 03:28:05 localhost podman[88459]: 2026-02-20 08:28:05.150700584 +0000 UTC m=+0.090002180 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:28:05 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:28:05 localhost podman[88460]: 2026-02-20 08:28:05.195943045 +0000 UTC m=+0.133886529 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=) Feb 20 03:28:05 localhost podman[88458]: 2026-02-20 08:28:05.242578159 +0000 UTC m=+0.183362091 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, build-date=2026-01-12T22:36:40Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, version=17.1.13) Feb 20 03:28:05 localhost podman[88461]: 2026-02-20 08:28:05.163070174 +0000 UTC m=+0.101063850 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step5, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git) Feb 20 03:28:05 localhost podman[88458]: 2026-02-20 08:28:05.295068673 +0000 UTC m=+0.235852625 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, version=17.1.13, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, 
managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=ovn_controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git) Feb 20 03:28:05 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:28:05 localhost podman[88460]: 2026-02-20 08:28:05.31640942 +0000 UTC m=+0.254352964 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1766032510, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:28:05 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:28:05 localhost podman[88461]: 2026-02-20 08:28:05.346035531 +0000 UTC m=+0.284029237 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:28:05 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:28:06 localhost systemd[1]: tmp-crun.SmFgiz.mount: Deactivated successfully. Feb 20 03:28:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:28:09 localhost systemd[1]: tmp-crun.aWNM79.mount: Deactivated successfully. 
Feb 20 03:28:09 localhost podman[88551]: 2026-02-20 08:28:09.139889133 +0000 UTC m=+0.082284432 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:28:09 localhost podman[88551]: 2026-02-20 08:28:09.506097456 +0000 UTC m=+0.448492745 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, 
container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4) Feb 20 03:28:09 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:28:14 localhost sshd[88575]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:28:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:28:27 localhost systemd[1]: tmp-crun.0b9wfA.mount: Deactivated successfully. 
Feb 20 03:28:27 localhost podman[88577]: 2026-02-20 08:28:27.147569923 +0000 UTC m=+0.086204022 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, container_name=metrics_qdr, distribution-scope=public, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:28:27 localhost podman[88577]: 2026-02-20 08:28:27.346347667 +0000 UTC m=+0.284981756 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:28:27 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:28:28 localhost sshd[88606]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:28:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:28:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:28:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:28:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:28:29 localhost systemd[1]: tmp-crun.RBtN3v.mount: Deactivated successfully. Feb 20 03:28:29 localhost podman[88608]: 2026-02-20 08:28:29.028791975 +0000 UTC m=+0.080786806 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 20 03:28:29 localhost podman[88616]: 2026-02-20 08:28:29.078179895 +0000 UTC m=+0.118124665 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, 
name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, container_name=ceilometer_agent_compute) Feb 20 03:28:29 localhost podman[88609]: 2026-02-20 08:28:29.086404058 +0000 UTC m=+0.129061741 container health_status 
1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron) Feb 20 03:28:29 localhost podman[88616]: 2026-02-20 08:28:29.101243274 +0000 UTC m=+0.141188064 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1) Feb 20 03:28:29 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:28:29 localhost podman[88608]: 2026-02-20 08:28:29.11702965 +0000 UTC m=+0.169024481 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:28:29 localhost podman[88609]: 2026-02-20 08:28:29.121420324 +0000 UTC m=+0.164078017 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com) Feb 20 03:28:29 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:28:29 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:28:29 localhost podman[88615]: 2026-02-20 08:28:29.054449414 +0000 UTC m=+0.095644143 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, distribution-scope=public, container_name=collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:28:29 localhost podman[88615]: 2026-02-20 08:28:29.183211535 +0000 UTC m=+0.224406264 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, config_id=tripleo_step3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, container_name=collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Feb 20 03:28:29 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:28:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Feb 20 03:28:29 localhost sshd[88697]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:28:29 localhost recover_tripleo_nova_virtqemud[88698]: 63005 Feb 20 03:28:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:28:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:28:30 localhost systemd[1]: tmp-crun.rywsdY.mount: Deactivated successfully. Feb 20 03:28:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:28:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:28:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:28:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:28:36 localhost systemd[85653]: Created slice User Background Tasks Slice. Feb 20 03:28:36 localhost systemd[85653]: Starting Cleanup of User's Temporary Files and Directories... 
Feb 20 03:28:36 localhost podman[88701]: 2026-02-20 08:28:36.173654807 +0000 UTC m=+0.099648356 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, 
name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc.) Feb 20 03:28:36 localhost systemd[85653]: Finished Cleanup of User's Temporary Files and Directories. Feb 20 03:28:36 localhost podman[88701]: 2026-02-20 08:28:36.189119152 +0000 UTC m=+0.115112711 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:28:36 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:28:36 localhost podman[88702]: 2026-02-20 08:28:36.268310459 +0000 UTC m=+0.188295023 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4) Feb 20 03:28:36 localhost systemd[1]: tmp-crun.G0pL8Z.mount: Deactivated successfully. 
Feb 20 03:28:36 localhost podman[88703]: 2026-02-20 08:28:36.325704444 +0000 UTC m=+0.245388119 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team) Feb 20 03:28:36 localhost podman[88700]: 2026-02-20 08:28:36.364098895 +0000 UTC m=+0.293311793 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, release=1766032510, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:28:36 localhost podman[88703]: 2026-02-20 08:28:36.383064227 +0000 UTC m=+0.302747893 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, 
name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true) Feb 20 03:28:36 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:28:36 localhost podman[88702]: 2026-02-20 08:28:36.394013334 +0000 UTC m=+0.313997868 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 20 03:28:36 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:28:36 localhost podman[88700]: 2026-02-20 08:28:36.437820552 +0000 UTC m=+0.367033490 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:28:36 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:28:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:28:40 localhost podman[88806]: 2026-02-20 08:28:40.060088526 +0000 UTC m=+0.096639173 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container) Feb 20 03:28:40 localhost podman[88806]: 2026-02-20 08:28:40.444483859 +0000 UTC m=+0.481034576 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:28:40 localhost systemd[1]: 
b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:28:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:28:58 localhost podman[88936]: 2026-02-20 08:28:58.126544703 +0000 UTC m=+0.063476734 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z) Feb 20 03:28:58 localhost podman[88936]: 2026-02-20 08:28:58.340141943 +0000 UTC m=+0.277073974 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64) Feb 20 03:28:58 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:29:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:29:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:29:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:29:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:29:00 localhost systemd[1]: tmp-crun.B21YD0.mount: Deactivated successfully. Feb 20 03:29:00 localhost podman[88967]: 2026-02-20 08:29:00.157859862 +0000 UTC m=+0.089471273 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510) Feb 20 03:29:00 localhost podman[88967]: 2026-02-20 08:29:00.17144704 +0000 UTC m=+0.103058451 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, release=1766032510, description=Red Hat OpenStack Platform 
17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true) Feb 20 03:29:00 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:29:00 localhost podman[88973]: 2026-02-20 08:29:00.260760357 +0000 UTC m=+0.187147207 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510) Feb 20 03:29:00 localhost podman[88965]: 2026-02-20 08:29:00.310656502 +0000 UTC m=+0.248685440 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible) Feb 20 03:29:00 localhost podman[88973]: 2026-02-20 08:29:00.314427628 +0000 UTC m=+0.240814448 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:29:00 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:29:00 localhost podman[88966]: 2026-02-20 08:29:00.359501604 +0000 UTC m=+0.293629942 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:29:00 localhost podman[88965]: 2026-02-20 08:29:00.365939103 +0000 UTC m=+0.303968051 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:29:00 localhost systemd[1]: 
1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:29:00 localhost podman[88966]: 2026-02-20 08:29:00.420445459 +0000 UTC m=+0.354573797 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 
17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.13, com.redhat.component=openstack-cron-container, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.) Feb 20 03:29:00 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:29:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:29:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:29:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:29:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:29:07 localhost systemd[1]: tmp-crun.L3zaiK.mount: Deactivated successfully. 
Feb 20 03:29:07 localhost podman[89056]: 2026-02-20 08:29:07.15304747 +0000 UTC m=+0.090747261 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:29:07 localhost podman[89056]: 2026-02-20 08:29:07.235941561 +0000 UTC m=+0.173641342 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, version=17.1.13, architecture=x86_64) Feb 20 03:29:07 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:29:07 localhost podman[89058]: 2026-02-20 08:29:07.247236818 +0000 UTC m=+0.135628822 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 20 03:29:07 localhost podman[89057]: 2026-02-20 08:29:07.217534384 +0000 UTC m=+0.152748339 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, 
url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:29:07 localhost podman[89059]: 2026-02-20 08:29:07.187927224 +0000 UTC m=+0.113636457 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:29:07 localhost podman[89057]: 2026-02-20 08:29:07.29769688 +0000 UTC m=+0.232910835 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step3, vcs-type=git, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:29:07 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:29:07 localhost podman[89059]: 2026-02-20 08:29:07.322161273 +0000 UTC m=+0.247870526 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Feb 20 03:29:07 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated 
successfully. Feb 20 03:29:07 localhost podman[89058]: 2026-02-20 08:29:07.354712653 +0000 UTC m=+0.243104637 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:29:07 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:29:08 localhost systemd[1]: tmp-crun.cuT8lv.mount: Deactivated successfully. Feb 20 03:29:09 localhost sshd[89147]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:29:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:29:10 localhost podman[89149]: 2026-02-20 08:29:10.594059802 +0000 UTC m=+0.083669825 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 20 03:29:11 localhost podman[89149]: 2026-02-20 08:29:11.038155377 +0000 UTC m=+0.527765370 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:29:11 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:29:14 localhost sshd[89172]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:29:27 localhost sshd[89174]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:29:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:29:29 localhost systemd[1]: tmp-crun.Yjsavt.mount: Deactivated successfully. 
Feb 20 03:29:29 localhost podman[89176]: 2026-02-20 08:29:29.162954571 +0000 UTC m=+0.100988589 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 
17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510) Feb 20 03:29:29 localhost podman[89176]: 2026-02-20 08:29:29.355922548 +0000 UTC m=+0.293956486 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:29:29 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:29:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:29:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:29:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:29:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:29:31 localhost podman[89208]: 2026-02-20 08:29:31.144167674 +0000 UTC m=+0.078125635 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:29:31 localhost podman[89206]: 2026-02-20 08:29:31.153514643 +0000 UTC m=+0.088632919 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, container_name=ceilometer_agent_ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Feb 20 03:29:31 localhost podman[89207]: 2026-02-20 08:29:31.208818264 +0000 UTC m=+0.142572969 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, 
health_status=healthy, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.component=openstack-cron-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.5) Feb 20 03:29:31 localhost podman[89206]: 2026-02-20 08:29:31.220008788 +0000 UTC m=+0.155127074 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, 
release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:29:31 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:29:31 localhost podman[89208]: 2026-02-20 08:29:31.234158973 +0000 UTC m=+0.168116934 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, release=1766032510, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:29:31 localhost podman[89207]: 2026-02-20 08:29:31.245866814 +0000 UTC m=+0.179621579 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 20 03:29:31 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:29:31 localhost podman[89209]: 2026-02-20 08:29:31.222825935 +0000 UTC m=+0.151209464 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 20 03:29:31 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:29:31 localhost podman[89209]: 2026-02-20 08:29:31.308027747 +0000 UTC m=+0.236411286 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, 
config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5) Feb 20 03:29:31 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:29:32 localhost systemd[1]: tmp-crun.jOJB1X.mount: Deactivated successfully. 
Feb 20 03:29:36 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:29:36 localhost recover_tripleo_nova_virtqemud[89298]: 63005 Feb 20 03:29:36 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:29:36 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:29:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:29:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:29:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:29:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:29:38 localhost systemd[1]: tmp-crun.0tu98f.mount: Deactivated successfully. 
Feb 20 03:29:38 localhost podman[89299]: 2026-02-20 08:29:38.161569548 +0000 UTC m=+0.092244080 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, 
config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:29:38 localhost podman[89301]: 2026-02-20 08:29:38.199965189 +0000 UTC m=+0.121877571 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=) Feb 20 03:29:38 localhost podman[89299]: 2026-02-20 08:29:38.207969026 +0000 UTC m=+0.138643558 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1) Feb 20 03:29:38 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:29:38 localhost podman[89301]: 2026-02-20 08:29:38.267533668 +0000 UTC m=+0.189446040 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4) Feb 20 03:29:38 localhost podman[89303]: 2026-02-20 08:29:38.275727481 +0000 UTC m=+0.197819919 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:29:38 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:29:38 localhost podman[89300]: 2026-02-20 08:29:38.326919146 +0000 UTC m=+0.250120668 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public) Feb 20 03:29:38 localhost podman[89300]: 2026-02-20 08:29:38.360118887 +0000 UTC m=+0.283320369 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510) Feb 20 03:29:38 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:29:38 localhost podman[89303]: 2026-02-20 08:29:38.380806744 +0000 UTC m=+0.302899182 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:29:38 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:29:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:29:41 localhost systemd[1]: tmp-crun.bt2Z3k.mount: Deactivated successfully. 
Feb 20 03:29:41 localhost podman[89408]: 2026-02-20 08:29:41.696198152 +0000 UTC m=+0.088862656 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:29:42 localhost podman[89408]: 2026-02-20 08:29:42.072290144 +0000 UTC m=+0.464954638 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, release=1766032510, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:29:42 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:29:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:29:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 476 writes, 1864 keys, 476 commit groups, 1.0 writes per commit group, ingest: 2.57 MB, 0.00 MB/s#012Interval WAL: 476 writes, 169 syncs, 2.82 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:29:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:29:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 522 writes, 1999 keys, 522 commit groups, 1.0 writes per commit group, ingest: 2.25 MB, 0.00 MB/s#012Interval WAL: 522 writes, 182 syncs, 2.87 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:29:53 localhost sshd[89492]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:29:58 localhost sshd[89539]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:30:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:30:00 localhost podman[89541]: 2026-02-20 08:30:00.149213697 +0000 UTC m=+0.087754891 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:30:00 localhost podman[89541]: 2026-02-20 08:30:00.36802533 +0000 UTC m=+0.306566474 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:30:00 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:30:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:30:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:30:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:30:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:30:02 localhost systemd[1]: tmp-crun.Fb208A.mount: Deactivated successfully. Feb 20 03:30:02 localhost podman[89571]: 2026-02-20 08:30:02.203847611 +0000 UTC m=+0.143543649 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:30:02 localhost podman[89573]: 2026-02-20 08:30:02.212436816 +0000 UTC m=+0.143301173 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5) Feb 20 03:30:02 localhost podman[89572]: 2026-02-20 08:30:02.251832008 +0000 UTC m=+0.187668547 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:30:02 localhost podman[89571]: 2026-02-20 08:30:02.265336153 +0000 UTC m=+0.205032191 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 20 03:30:02 localhost podman[89570]: 2026-02-20 08:30:02.170536326 +0000 UTC m=+0.110959996 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.) Feb 20 03:30:02 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:30:02 localhost podman[89572]: 2026-02-20 08:30:02.293092807 +0000 UTC m=+0.228929326 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, release=1766032510, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:30:02 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:30:02 localhost podman[89570]: 2026-02-20 08:30:02.305449518 +0000 UTC m=+0.245873168 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git) Feb 20 03:30:02 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:30:02 localhost podman[89573]: 2026-02-20 08:30:02.31951224 +0000 UTC m=+0.250376607 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:30:02 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:30:03 localhost systemd[1]: tmp-crun.wtxT2t.mount: Deactivated successfully. Feb 20 03:30:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:30:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:30:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:30:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:30:09 localhost systemd[1]: tmp-crun.vCYBVE.mount: Deactivated successfully. Feb 20 03:30:09 localhost podman[89661]: 2026-02-20 08:30:09.203118234 +0000 UTC m=+0.139622418 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:30:09 localhost podman[89661]: 2026-02-20 08:30:09.224072768 +0000 UTC m=+0.160576952 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:30:09 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:30:09 localhost podman[89663]: 2026-02-20 08:30:09.267820955 +0000 UTC m=+0.194340561 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, release=1766032510, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 20 03:30:09 localhost podman[89662]: 2026-02-20 08:30:09.184052977 +0000 UTC m=+0.115551116 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, container_name=iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., 
config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:30:09 localhost podman[89669]: 2026-02-20 08:30:09.312953803 +0000 UTC m=+0.236759955 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 20 03:30:09 localhost podman[89669]: 2026-02-20 08:30:09.342884845 +0000 UTC m=+0.266690997 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:30:09 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:30:09 localhost podman[89662]: 2026-02-20 08:30:09.36680797 +0000 UTC m=+0.298306129 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team) Feb 20 03:30:09 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:30:09 localhost podman[89663]: 2026-02-20 08:30:09.393352028 +0000 UTC m=+0.319871634 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:30:09 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:30:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:30:13 localhost podman[89757]: 2026-02-20 08:30:13.145306798 +0000 UTC m=+0.085296015 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target) Feb 20 03:30:13 localhost podman[89757]: 2026-02-20 08:30:13.509386331 +0000 UTC m=+0.449375638 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.buildah.version=1.41.5, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:30:13 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:30:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:30:31 localhost podman[89780]: 2026-02-20 08:30:31.127527396 +0000 UTC m=+0.068014784 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 20 03:30:31 localhost podman[89780]: 2026-02-20 08:30:31.319966837 +0000 UTC m=+0.260454285 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, release=1766032510, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 03:30:31 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:30:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:30:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:30:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:30:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:30:33 localhost podman[89811]: 2026-02-20 08:30:33.134158592 +0000 UTC m=+0.068224371 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=collectd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git) Feb 20 03:30:33 localhost podman[89811]: 2026-02-20 08:30:33.142892571 +0000 UTC m=+0.076958390 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, architecture=x86_64) Feb 20 03:30:33 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:30:33 localhost podman[89809]: 2026-02-20 08:30:33.156158018 +0000 UTC m=+0.093886909 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:30:33 localhost podman[89809]: 2026-02-20 08:30:33.207173499 +0000 UTC m=+0.144902360 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:30:33 localhost podman[89810]: 2026-02-20 08:30:33.20952931 +0000 UTC m=+0.142741483 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=logrotate_crond, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:30:33 localhost systemd[1]: 
1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:30:33 localhost podman[89810]: 2026-02-20 08:30:33.295409123 +0000 UTC m=+0.228621246 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, 
name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, container_name=logrotate_crond, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 20 03:30:33 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:30:33 localhost podman[89815]: 2026-02-20 08:30:33.266892976 +0000 UTC m=+0.195173587 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 20 03:30:33 localhost podman[89815]: 2026-02-20 08:30:33.349939462 +0000 UTC m=+0.278220143 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute) Feb 20 03:30:33 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:30:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:30:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:30:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:30:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:30:40 localhost podman[89903]: 2026-02-20 08:30:40.157045391 +0000 UTC m=+0.086225884 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vendor=Red Hat, Inc., container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:30:40 localhost podman[89903]: 2026-02-20 08:30:40.169007359 +0000 UTC m=+0.098187912 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5) Feb 20 03:30:40 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:30:40 localhost systemd[1]: tmp-crun.MoTSp5.mount: Deactivated successfully. 
Feb 20 03:30:40 localhost podman[89905]: 2026-02-20 08:30:40.226006873 +0000 UTC m=+0.147767188 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc.) 
Feb 20 03:30:40 localhost podman[89905]: 2026-02-20 08:30:40.252982523 +0000 UTC m=+0.174742898 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step5, tcib_managed=true) Feb 20 03:30:40 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:30:40 localhost podman[89902]: 2026-02-20 08:30:40.301463145 +0000 UTC m=+0.232458714 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, container_name=ovn_controller, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 03:30:40 localhost podman[89902]: 2026-02-20 08:30:40.352094293 +0000 UTC m=+0.283089872 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com) Feb 20 03:30:40 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:30:40 localhost podman[89904]: 2026-02-20 08:30:40.362912296 +0000 UTC m=+0.289124618 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4) Feb 20 03:30:40 localhost podman[89904]: 2026-02-20 08:30:40.406976292 +0000 UTC m=+0.333188604 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 20 03:30:40 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:30:43 localhost sshd[89996]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:30:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:30:44 localhost systemd[1]: tmp-crun.Tr43b0.mount: Deactivated successfully. 
Feb 20 03:30:44 localhost podman[89998]: 2026-02-20 08:30:44.150698999 +0000 UTC m=+0.090137104 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:30:44 localhost podman[89998]: 2026-02-20 08:30:44.511861853 +0000 UTC m=+0.451299998 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container) Feb 20 03:30:44 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:30:56 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:30:56 localhost recover_tripleo_nova_virtqemud[90152]: 63005 Feb 20 03:30:56 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:30:56 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:31:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:31:02 localhost sshd[90210]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:31:02 localhost systemd[1]: tmp-crun.4j52RB.mount: Deactivated successfully. Feb 20 03:31:02 localhost podman[90198]: 2026-02-20 08:31:02.153022177 +0000 UTC m=+0.090037461 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 
qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team) Feb 20 03:31:02 localhost podman[90198]: 2026-02-20 08:31:02.360993407 +0000 UTC m=+0.298008641 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:31:02 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:31:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:31:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:31:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:31:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:31:04 localhost systemd[1]: tmp-crun.PZpOP7.mount: Deactivated successfully. Feb 20 03:31:04 localhost systemd[1]: tmp-crun.4gQc0J.mount: Deactivated successfully. Feb 20 03:31:04 localhost podman[90241]: 2026-02-20 08:31:04.195209168 +0000 UTC m=+0.120997075 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:31:04 localhost podman[90229]: 2026-02-20 08:31:04.149284064 +0000 UTC m=+0.087769041 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible) Feb 20 03:31:04 localhost podman[90228]: 2026-02-20 08:31:04.215891965 +0000 UTC m=+0.153359421 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:31:04 localhost podman[90229]: 2026-02-20 08:31:04.231990749 +0000 UTC m=+0.170475686 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, distribution-scope=public, tcib_managed=true, container_name=logrotate_crond) Feb 20 03:31:04 localhost podman[90228]: 2026-02-20 08:31:04.241834983 +0000 UTC m=+0.179302389 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, 
release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, 
name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 20 03:31:04 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:31:04 localhost podman[90241]: 2026-02-20 08:31:04.242373169 +0000 UTC m=+0.168161066 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 20 03:31:04 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:31:04 localhost podman[90230]: 2026-02-20 08:31:04.183967353 +0000 UTC m=+0.115073614 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, container_name=collectd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510) Feb 20 03:31:04 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:31:04 localhost podman[90230]: 2026-02-20 08:31:04.313517017 +0000 UTC m=+0.244623328 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3) Feb 20 03:31:04 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:31:10 localhost sshd[90323]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:31:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:31:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:31:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:31:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:31:11 localhost systemd[1]: tmp-crun.utdfF3.mount: Deactivated successfully. 
Feb 20 03:31:11 localhost podman[90328]: 2026-02-20 08:31:11.169718777 +0000 UTC m=+0.100732691 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute) Feb 20 03:31:11 localhost podman[90326]: 2026-02-20 08:31:11.211027798 +0000 UTC m=+0.147277483 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, 
io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, release=1766032510, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 20 03:31:11 localhost podman[90326]: 2026-02-20 08:31:11.219846139 +0000 UTC m=+0.156095844 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:31:11 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:31:11 localhost podman[90328]: 2026-02-20 08:31:11.305203635 +0000 UTC m=+0.236217489 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 
nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com) Feb 20 03:31:11 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:31:11 localhost podman[90325]: 2026-02-20 08:31:11.35735175 +0000 UTC m=+0.295116231 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1) Feb 20 03:31:11 localhost podman[90327]: 2026-02-20 08:31:11.408134613 +0000 UTC m=+0.338769515 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=ovn_metadata_agent, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 20 03:31:11 localhost podman[90325]: 2026-02-20 08:31:11.437144955 +0000 UTC m=+0.374909396 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, architecture=x86_64) Feb 20 03:31:11 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:31:11 localhost podman[90327]: 2026-02-20 08:31:11.483330747 +0000 UTC m=+0.413965659 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 20 03:31:11 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:31:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:31:15 localhost podman[90414]: 2026-02-20 08:31:15.141962958 +0000 UTC m=+0.077731243 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5) Feb 20 03:31:15 localhost podman[90414]: 2026-02-20 08:31:15.488022956 +0000 UTC m=+0.423791211 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target) Feb 20 03:31:15 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:31:25 localhost sshd[90437]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:31:27 localhost sshd[90439]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:31:28 localhost sshd[90441]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:31:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:31:33 localhost systemd[1]: tmp-crun.Z6uuEL.mount: Deactivated successfully. Feb 20 03:31:33 localhost podman[90443]: 2026-02-20 08:31:33.144305097 +0000 UTC m=+0.083823820 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, 
container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, config_id=tripleo_step1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 20 03:31:33 localhost podman[90443]: 2026-02-20 08:31:33.363014607 +0000 UTC m=+0.302533260 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13) Feb 20 03:31:33 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:31:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:31:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:31:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:31:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:31:35 localhost podman[90473]: 2026-02-20 08:31:35.129321118 +0000 UTC m=+0.070012525 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:31:35 localhost podman[90474]: 2026-02-20 08:31:35.152046577 +0000 UTC m=+0.086350078 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond) Feb 20 03:31:35 localhost podman[90475]: 2026-02-20 08:31:35.210794545 +0000 UTC m=+0.141573278 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) Feb 20 03:31:35 localhost podman[90473]: 2026-02-20 08:31:35.258817873 +0000 UTC m=+0.199509290 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:31:35 localhost podman[90481]: 2026-02-20 08:31:35.268322295 +0000 UTC m=+0.195005981 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team) Feb 20 03:31:35 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated 
successfully. Feb 20 03:31:35 localhost podman[90474]: 2026-02-20 08:31:35.290390044 +0000 UTC m=+0.224693575 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
build-date=2026-01-12T22:10:15Z, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:31:35 localhost podman[90475]: 2026-02-20 08:31:35.298213945 +0000 UTC m=+0.228992718 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, container_name=collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, build-date=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Feb 20 03:31:35 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:31:35 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:31:35 localhost podman[90481]: 2026-02-20 08:31:35.34742996 +0000 UTC m=+0.274113686 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:31:35 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:31:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:31:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:31:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:31:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:31:42 localhost systemd[1]: tmp-crun.EoiROg.mount: Deactivated successfully. 
Feb 20 03:31:42 localhost podman[90566]: 2026-02-20 08:31:42.510090198 +0000 UTC m=+0.091398683 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 20 03:31:42 localhost podman[90564]: 2026-02-20 08:31:42.5725371 +0000 UTC m=+0.160173660 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:31:42 localhost podman[90564]: 2026-02-20 08:31:42.596164267 +0000 UTC m=+0.183800847 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:31:42 localhost podman[90567]: 2026-02-20 08:31:42.602858193 +0000 UTC m=+0.181289299 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:31:42 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:31:42 localhost podman[90567]: 2026-02-20 08:31:42.65800604 +0000 UTC m=+0.236437146 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-type=git, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com) Feb 20 03:31:42 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:31:42 localhost podman[90565]: 2026-02-20 08:31:42.673389804 +0000 UTC m=+0.256772113 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:31:42 localhost podman[90566]: 2026-02-20 08:31:42.687719525 +0000 UTC m=+0.269027970 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64) Feb 20 03:31:42 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:31:42 localhost podman[90565]: 2026-02-20 08:31:42.711113524 +0000 UTC m=+0.294495813 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, 
konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1) Feb 20 03:31:42 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:31:43 localhost systemd[1]: tmp-crun.Gvb4hb.mount: Deactivated successfully. Feb 20 03:31:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:31:46 localhost systemd[1]: tmp-crun.n31or9.mount: Deactivated successfully. 
Feb 20 03:31:46 localhost podman[90659]: 2026-02-20 08:31:46.155233434 +0000 UTC m=+0.085191393 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:31:46 localhost podman[90659]: 2026-02-20 08:31:46.513459316 +0000 UTC m=+0.443417315 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:31:46 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:32:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:32:04 localhost systemd[1]: tmp-crun.nRlWSG.mount: Deactivated successfully. 
Feb 20 03:32:04 localhost podman[90781]: 2026-02-20 08:32:04.167803395 +0000 UTC m=+0.095815339 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13) Feb 20 03:32:04 localhost podman[90781]: 2026-02-20 08:32:04.366582712 +0000 UTC m=+0.294594676 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1) Feb 20 03:32:04 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:32:05 localhost sshd[90812]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:32:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:32:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:32:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:32:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:32:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:32:06 localhost recover_tripleo_nova_virtqemud[90834]: 63005 Feb 20 03:32:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:32:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:32:06 localhost systemd[1]: tmp-crun.8evlXO.mount: Deactivated successfully. Feb 20 03:32:06 localhost podman[90814]: 2026-02-20 08:32:06.168428807 +0000 UTC m=+0.102394963 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4) Feb 20 03:32:06 localhost podman[90817]: 2026-02-20 08:32:06.16823964 +0000 UTC m=+0.093275291 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, vcs-type=git, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, 
distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:32:06 localhost podman[90816]: 2026-02-20 08:32:06.255824636 +0000 UTC m=+0.183609252 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, tcib_managed=true, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:32:06 localhost podman[90816]: 2026-02-20 08:32:06.266937708 +0000 UTC m=+0.194722304 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:32:06 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:32:06 localhost podman[90815]: 2026-02-20 08:32:06.309130756 +0000 UTC m=+0.241621817 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, 
Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:32:06 localhost podman[90814]: 2026-02-20 08:32:06.321671681 +0000 UTC m=+0.255637897 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, distribution-scope=public, 
com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 20 03:32:06 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:32:06 localhost podman[90815]: 2026-02-20 08:32:06.344211045 +0000 UTC m=+0.276702126 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=logrotate_crond, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, release=1766032510) Feb 20 03:32:06 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:32:06 localhost podman[90817]: 2026-02-20 08:32:06.403019515 +0000 UTC m=+0.328055226 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:32:06 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:32:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:32:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:32:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:32:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:32:13 localhost podman[90908]: 2026-02-20 08:32:13.149170349 +0000 UTC m=+0.076948618 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=) Feb 20 03:32:13 localhost systemd[1]: tmp-crun.F66HKU.mount: Deactivated successfully. 
Feb 20 03:32:13 localhost podman[90908]: 2026-02-20 08:32:13.216042447 +0000 UTC m=+0.143820736 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=ovn_metadata_agent, distribution-scope=public) Feb 20 03:32:13 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:32:13 localhost podman[90909]: 2026-02-20 08:32:13.269461121 +0000 UTC m=+0.190987008 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:32:13 localhost podman[90907]: 2026-02-20 08:32:13.218442321 +0000 UTC m=+0.146735426 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, tcib_managed=true, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, 
cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 
17.1_20260112.1) Feb 20 03:32:13 localhost podman[90906]: 2026-02-20 08:32:13.321457681 +0000 UTC m=+0.250162958 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 03:32:13 localhost podman[90907]: 2026-02-20 08:32:13.358126689 +0000 UTC m=+0.286419794 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:32:13 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:32:13 localhost podman[90906]: 2026-02-20 08:32:13.373998778 +0000 UTC m=+0.302704105 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, url=https://www.redhat.com) Feb 20 03:32:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:32:13 localhost podman[90909]: 2026-02-20 08:32:13.426584236 +0000 UTC m=+0.348110133 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:32:13 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:32:13 localhost sshd[90997]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:32:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:32:17 localhost podman[90999]: 2026-02-20 08:32:17.141051993 +0000 UTC m=+0.079421244 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:32:17 localhost podman[90999]: 2026-02-20 08:32:17.513792803 +0000 UTC m=+0.452162084 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4) Feb 20 03:32:17 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:32:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:32:35 localhost systemd[1]: tmp-crun.lF7ev8.mount: Deactivated successfully. 
Feb 20 03:32:35 localhost podman[91022]: 2026-02-20 08:32:35.154753253 +0000 UTC m=+0.096502381 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, batch=17.1_20260112.1, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Feb 20 03:32:35 localhost podman[91022]: 2026-02-20 08:32:35.350091283 +0000 UTC m=+0.291840401 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:32:35 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:32:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:32:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:32:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:32:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:32:37 localhost systemd[1]: tmp-crun.gyw7w9.mount: Deactivated successfully. Feb 20 03:32:37 localhost podman[91053]: 2026-02-20 08:32:37.166883827 +0000 UTC m=+0.099433611 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, version=17.1.13) Feb 20 03:32:37 localhost podman[91052]: 2026-02-20 08:32:37.218529816 +0000 UTC m=+0.148182891 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, container_name=logrotate_crond) Feb 20 03:32:37 localhost podman[91052]: 2026-02-20 08:32:37.226841452 +0000 UTC m=+0.156494537 container exec_died 
1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:32:37 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:32:37 localhost podman[91054]: 2026-02-20 08:32:37.269904168 +0000 UTC m=+0.198452518 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:32:37 localhost podman[91053]: 2026-02-20 08:32:37.281872126 +0000 UTC m=+0.214421990 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, 
url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 20 03:32:37 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:32:37 localhost podman[91054]: 2026-02-20 08:32:37.327122478 +0000 UTC m=+0.255670788 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5) Feb 20 03:32:37 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:32:37 localhost sshd[91121]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:32:37 localhost podman[91051]: 2026-02-20 08:32:37.378292962 +0000 UTC m=+0.310108982 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:32:37 localhost podman[91051]: 2026-02-20 08:32:37.439173166 +0000 UTC m=+0.370989196 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 20 03:32:37 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:32:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:32:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:32:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. 
Feb 20 03:32:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:32:44 localhost systemd[1]: tmp-crun.aBNPpw.mount: Deactivated successfully. Feb 20 03:32:44 localhost podman[91139]: 2026-02-20 08:32:44.205876875 +0000 UTC m=+0.133402727 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Feb 20 03:32:44 localhost podman[91138]: 2026-02-20 08:32:44.176016925 +0000 UTC m=+0.106186088 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, release=1766032510) Feb 20 03:32:44 localhost podman[91138]: 2026-02-20 08:32:44.262019232 +0000 UTC m=+0.192188345 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1) Feb 20 03:32:44 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:32:44 localhost podman[91145]: 2026-02-20 08:32:44.273917978 +0000 UTC m=+0.197866510 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, 
managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:32:44 localhost podman[91145]: 2026-02-20 08:32:44.307057858 +0000 UTC m=+0.231006430 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:32:44 localhost podman[91137]: 2026-02-20 08:32:44.316416335 +0000 UTC m=+0.251642364 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., vcs-type=git) Feb 20 03:32:44 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:32:44 localhost podman[91139]: 2026-02-20 08:32:44.335182973 +0000 UTC m=+0.262708825 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 20 03:32:44 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:32:44 localhost podman[91137]: 2026-02-20 08:32:44.369152158 +0000 UTC m=+0.304378187 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 20 03:32:44 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:32:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:32:48 localhost systemd[1]: tmp-crun.njZnIe.mount: Deactivated successfully. Feb 20 03:32:48 localhost podman[91227]: 2026-02-20 08:32:48.144566671 +0000 UTC m=+0.088985980 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red 
Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64) Feb 20 03:32:48 localhost podman[91227]: 2026-02-20 08:32:48.517086164 +0000 UTC m=+0.461505413 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 
03:32:48 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:32:50 localhost podman[91350]: 2026-02-20 08:32:50.285358295 +0000 UTC m=+0.095870882 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_BRANCH=main, io.buildah.version=1.42.2, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.) 
Feb 20 03:32:50 localhost podman[91350]: 2026-02-20 08:32:50.418255344 +0000 UTC m=+0.228767931 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, vcs-type=git, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 20 03:32:58 localhost sshd[91491]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:33:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:33:06 localhost podman[91516]: 2026-02-20 08:33:06.160441225 +0000 UTC m=+0.094430527 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, version=17.1.13) Feb 20 03:33:06 localhost podman[91516]: 2026-02-20 08:33:06.371133938 +0000 UTC m=+0.305123200 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, release=1766032510, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=metrics_qdr, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:33:06 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:33:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:33:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:33:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:33:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:33:08 localhost podman[91547]: 2026-02-20 08:33:08.156717721 +0000 UTC m=+0.089157074 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:33:08 localhost podman[91547]: 2026-02-20 08:33:08.191323267 +0000 UTC m=+0.123762670 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Feb 20 03:33:08 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: 
Deactivated successfully. Feb 20 03:33:08 localhost podman[91546]: 2026-02-20 08:33:08.210511616 +0000 UTC m=+0.145661933 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 20 03:33:08 localhost podman[91546]: 2026-02-20 08:33:08.216595664 +0000 UTC m=+0.151745991 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond) Feb 20 03:33:08 localhost podman[91548]: 2026-02-20 08:33:08.176742777 +0000 UTC m=+0.102413441 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, container_name=ceilometer_agent_compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 20 
03:33:08 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:33:08 localhost podman[91548]: 2026-02-20 08:33:08.256902104 +0000 UTC m=+0.182572718 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, 
com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:33:08 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:33:08 localhost podman[91545]: 2026-02-20 08:33:08.3097378 +0000 UTC m=+0.244561307 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=) Feb 20 03:33:08 localhost podman[91545]: 2026-02-20 08:33:08.341050574 +0000 UTC m=+0.275874101 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 20 03:33:08 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:33:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:33:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:33:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. 
Feb 20 03:33:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:33:15 localhost systemd[1]: tmp-crun.NbN9Yy.mount: Deactivated successfully. Feb 20 03:33:15 localhost podman[91639]: 2026-02-20 08:33:15.218615522 +0000 UTC m=+0.145729696 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20260112.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:33:15 localhost podman[91639]: 2026-02-20 08:33:15.250935637 +0000 UTC m=+0.178049791 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Feb 20 03:33:15 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:33:15 localhost podman[91638]: 2026-02-20 08:33:15.265534026 +0000 UTC m=+0.196819937 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:33:15 localhost podman[91637]: 2026-02-20 08:33:15.183262093 +0000 UTC m=+0.114376189 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:33:15 localhost podman[91637]: 2026-02-20 08:33:15.317910587 +0000 UTC m=+0.249024633 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, 
com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:33:15 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:33:15 localhost podman[91636]: 2026-02-20 08:33:15.374665194 +0000 UTC m=+0.306848084 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, release=1766032510) Feb 20 03:33:15 localhost podman[91638]: 2026-02-20 08:33:15.389571682 +0000 UTC m=+0.320857623 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:33:15 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:33:15 localhost podman[91636]: 2026-02-20 08:33:15.428106738 +0000 UTC m=+0.360289628 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-type=git, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ovn_controller) Feb 20 03:33:15 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:33:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:33:19 localhost podman[91731]: 2026-02-20 08:33:19.139961806 +0000 UTC m=+0.079632482 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:33:19 localhost podman[91731]: 2026-02-20 08:33:19.522465605 +0000 UTC m=+0.462136251 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, vcs-type=git, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 
nova-compute) Feb 20 03:33:19 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:33:23 localhost sshd[91753]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:33:24 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:33:24 localhost recover_tripleo_nova_virtqemud[91756]: 63005 Feb 20 03:33:24 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:33:24 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:33:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:33:37 localhost podman[91757]: 2026-02-20 08:33:37.128774586 +0000 UTC m=+0.073317257 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=) Feb 20 03:33:37 localhost podman[91757]: 2026-02-20 08:33:37.369784162 +0000 UTC m=+0.314326853 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, 
tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:33:37 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:33:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:33:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:33:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:33:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:33:39 localhost systemd[1]: tmp-crun.j607v8.mount: Deactivated successfully. Feb 20 03:33:39 localhost podman[91787]: 2026-02-20 08:33:39.156599654 +0000 UTC m=+0.090539296 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public) Feb 20 03:33:39 localhost podman[91788]: 2026-02-20 08:33:39.208174451 +0000 UTC m=+0.133623692 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack 
Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.buildah.version=1.41.5, 
vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, vendor=Red Hat, Inc.) Feb 20 03:33:39 localhost podman[91788]: 2026-02-20 08:33:39.218404916 +0000 UTC m=+0.143854177 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 20 03:33:39 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:33:39 localhost podman[91787]: 2026-02-20 08:33:39.271230001 +0000 UTC m=+0.205169673 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-cron, 
cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, vendor=Red Hat, Inc.) Feb 20 03:33:39 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:33:39 localhost podman[91786]: 2026-02-20 08:33:39.316916347 +0000 UTC m=+0.253087128 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true) Feb 20 03:33:39 localhost podman[91790]: 2026-02-20 08:33:39.275460201 +0000 UTC m=+0.199649924 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, 
vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.buildah.version=1.41.5) Feb 20 03:33:39 localhost podman[91790]: 2026-02-20 08:33:39.358261849 +0000 UTC m=+0.282451532 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, config_id=tripleo_step4, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) Feb 20 03:33:39 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:33:39 localhost podman[91786]: 2026-02-20 08:33:39.375094497 +0000 UTC m=+0.311265258 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible) Feb 20 03:33:39 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:33:40 localhost systemd[1]: tmp-crun.zJ7Cw5.mount: Deactivated successfully. Feb 20 03:33:44 localhost sshd[91877]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:33:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:33:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:33:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:33:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:33:46 localhost systemd[1]: tmp-crun.zxpavL.mount: Deactivated successfully. 
Feb 20 03:33:46 localhost podman[91879]: 2026-02-20 08:33:46.174591774 +0000 UTC m=+0.101276417 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 20 03:33:46 localhost systemd[1]: tmp-crun.kB9T6H.mount: Deactivated successfully. Feb 20 03:33:46 localhost podman[91880]: 2026-02-20 08:33:46.230843475 +0000 UTC m=+0.157499067 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) 
Feb 20 03:33:46 localhost podman[91881]: 2026-02-20 08:33:46.271670401 +0000 UTC m=+0.197252341 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=) Feb 20 03:33:46 localhost podman[91880]: 2026-02-20 08:33:46.277285164 +0000 UTC m=+0.203940766 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:33:46 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:33:46 localhost podman[91881]: 2026-02-20 08:33:46.322604888 +0000 UTC m=+0.248186828 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510) Feb 20 03:33:46 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:33:46 localhost podman[91882]: 2026-02-20 08:33:46.337979322 +0000 UTC m=+0.257957049 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, tcib_managed=true) Feb 20 03:33:46 localhost podman[91879]: 2026-02-20 08:33:46.3639262 +0000 UTC m=+0.290610843 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, version=17.1.13) Feb 20 03:33:46 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:33:46 localhost podman[91882]: 2026-02-20 08:33:46.393266803 +0000 UTC m=+0.313244560 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc.) Feb 20 03:33:46 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:33:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:33:50 localhost podman[91970]: 2026-02-20 08:33:50.148491294 +0000 UTC m=+0.083496760 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, vcs-type=git) Feb 20 03:33:50 localhost podman[91970]: 2026-02-20 08:33:50.550336789 +0000 UTC m=+0.485342235 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=nova_migration_target, release=1766032510, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:33:50 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:34:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:34:08 localhost podman[92093]: 2026-02-20 08:34:08.183549069 +0000 UTC m=+0.117869558 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:34:08 localhost podman[92093]: 2026-02-20 08:34:08.352053244 +0000 UTC m=+0.286373743 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 03:34:08 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:34:10 localhost systemd[1]: tmp-crun.Uhh0EO.mount: Deactivated successfully. Feb 20 03:34:10 localhost podman[92123]: 2026-02-20 08:34:10.165958959 +0000 UTC m=+0.096027266 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, architecture=x86_64, version=17.1.13, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true) Feb 20 03:34:10 localhost podman[92122]: 2026-02-20 08:34:10.142675903 +0000 UTC m=+0.079284970 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, version=17.1.13, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., tcib_managed=true) Feb 20 03:34:10 localhost podman[92121]: 2026-02-20 08:34:10.19719955 +0000 UTC m=+0.135990305 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, 
health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z) Feb 20 03:34:10 localhost podman[92123]: 2026-02-20 08:34:10.223695986 +0000 UTC m=+0.153764323 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, version=17.1.13) Feb 20 03:34:10 localhost podman[92121]: 2026-02-20 08:34:10.233064394 +0000 UTC m=+0.171855189 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 20 03:34:10 localhost podman[92129]: 2026-02-20 08:34:10.178824045 +0000 UTC m=+0.102957269 
container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:34:10 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:34:10 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:34:10 localhost podman[92122]: 2026-02-20 08:34:10.276431239 +0000 UTC m=+0.213040376 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
container_name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 20 03:34:10 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:34:10 localhost podman[92129]: 2026-02-20 08:34:10.318240125 +0000 UTC m=+0.242373399 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible) Feb 20 03:34:10 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:34:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:34:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:34:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:34:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:34:17 localhost podman[92208]: 2026-02-20 08:34:17.142708049 +0000 UTC m=+0.082488349 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, batch=17.1_20260112.1) Feb 20 03:34:17 localhost systemd[1]: tmp-crun.YmfCqC.mount: Deactivated successfully. Feb 20 03:34:17 localhost podman[92210]: 2026-02-20 08:34:17.161522298 +0000 UTC m=+0.092515788 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 20 03:34:17 localhost podman[92208]: 2026-02-20 08:34:17.203137008 +0000 UTC m=+0.142917288 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 03:34:17 localhost podman[92209]: 2026-02-20 08:34:17.203717427 +0000 UTC m=+0.137942727 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64) Feb 20 03:34:17 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:34:17 localhost podman[92210]: 2026-02-20 08:34:17.226063924 +0000 UTC m=+0.157057404 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13) Feb 20 03:34:17 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:34:17 localhost podman[92216]: 2026-02-20 08:34:17.316289281 +0000 UTC m=+0.244169825 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step5, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.13, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git) Feb 20 03:34:17 localhost podman[92209]: 2026-02-20 08:34:17.338527765 +0000 UTC m=+0.272753105 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) 
Feb 20 03:34:17 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:34:17 localhost podman[92216]: 2026-02-20 08:34:17.403800483 +0000 UTC m=+0.331680997 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container) Feb 20 03:34:17 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:34:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:34:21 localhost podman[92301]: 2026-02-20 08:34:21.136990596 +0000 UTC m=+0.075619918 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Feb 20 03:34:21 localhost podman[92301]: 2026-02-20 08:34:21.520273269 +0000 UTC m=+0.458902571 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:34:21 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:34:30 localhost sshd[92324]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:34:35 localhost sshd[92326]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:34:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:34:39 localhost systemd[1]: tmp-crun.sVaaUm.mount: Deactivated successfully. 
Feb 20 03:34:39 localhost podman[92328]: 2026-02-20 08:34:39.154975706 +0000 UTC m=+0.089294959 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:34:39 localhost podman[92328]: 2026-02-20 08:34:39.354470136 +0000 UTC m=+0.288789369 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1) Feb 20 03:34:39 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:34:41 localhost podman[92356]: 2026-02-20 08:34:41.162774039 +0000 UTC m=+0.098591935 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:30Z, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com) Feb 20 03:34:41 localhost podman[92356]: 2026-02-20 08:34:41.195153806 +0000 UTC m=+0.130971702 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:34:41 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:34:41 localhost systemd[1]: tmp-crun.88M1hr.mount: Deactivated successfully. 
Feb 20 03:34:41 localhost podman[92357]: 2026-02-20 08:34:41.256774972 +0000 UTC m=+0.190202424 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, version=17.1.13, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-cron-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z) Feb 20 03:34:41 localhost podman[92359]: 2026-02-20 08:34:41.235489166 +0000 UTC m=+0.163757289 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git) Feb 20 03:34:41 localhost podman[92357]: 2026-02-20 08:34:41.290148909 +0000 UTC m=+0.223576341 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}) Feb 20 03:34:41 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:34:41 localhost podman[92359]: 2026-02-20 08:34:41.320247904 +0000 UTC m=+0.248515937 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:34:41 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:34:41 localhost podman[92358]: 2026-02-20 08:34:41.375909417 +0000 UTC m=+0.305616795 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible) Feb 20 03:34:41 localhost podman[92358]: 2026-02-20 08:34:41.415102843 +0000 UTC m=+0.344810181 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:34:41 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:34:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:34:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:34:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:34:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:34:48 localhost systemd[1]: tmp-crun.dse0Vf.mount: Deactivated successfully. Feb 20 03:34:48 localhost podman[92448]: 2026-02-20 08:34:48.18957127 +0000 UTC m=+0.129317420 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 03:34:48 localhost podman[92449]: 2026-02-20 08:34:48.145803294 +0000 UTC m=+0.085845073 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 20 03:34:48 localhost podman[92448]: 2026-02-20 08:34:48.234964798 +0000 UTC m=+0.174710978 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13) Feb 20 03:34:48 localhost podman[92450]: 2026-02-20 08:34:48.24188439 +0000 UTC m=+0.174058817 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13) Feb 20 03:34:48 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:34:48 localhost podman[92460]: 2026-02-20 08:34:48.16843465 +0000 UTC m=+0.092969761 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step5, managed_by=tripleo_ansible, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, vcs-type=git) Feb 20 03:34:48 localhost podman[92449]: 2026-02-20 08:34:48.283058028 +0000 UTC m=+0.223099857 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64) Feb 20 03:34:48 localhost 
podman[92450]: 2026-02-20 08:34:48.291519928 +0000 UTC m=+0.223694415 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=) Feb 20 03:34:48 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:34:48 localhost podman[92460]: 2026-02-20 08:34:48.299357499 +0000 UTC m=+0.223892630 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, vcs-type=git, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64) Feb 20 03:34:48 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:34:48 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:34:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:34:52 localhost podman[92541]: 2026-02-20 08:34:52.112381159 +0000 UTC m=+0.057497171 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:34:52 localhost podman[92541]: 2026-02-20 08:34:52.512107848 +0000 UTC m=+0.457223920 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z) Feb 20 03:34:52 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:35:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:35:06 localhost recover_tripleo_nova_virtqemud[92644]: 63005 Feb 20 03:35:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:35:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:35:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:35:10 localhost podman[92645]: 2026-02-20 08:35:10.157892876 +0000 UTC m=+0.090607709 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13) Feb 20 03:35:10 localhost podman[92645]: 2026-02-20 08:35:10.341923869 +0000 UTC m=+0.274638672 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, 
url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=) Feb 20 03:35:10 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:35:12 localhost podman[92681]: 2026-02-20 08:35:12.157447224 +0000 UTC m=+0.082007204 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:35:12 localhost systemd[1]: tmp-crun.whPCoY.mount: Deactivated successfully. 
Feb 20 03:35:12 localhost podman[92682]: 2026-02-20 08:35:12.224811237 +0000 UTC m=+0.146130888 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, version=17.1.13, 
name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:35:12 localhost podman[92674]: 2026-02-20 08:35:12.19827652 +0000 UTC m=+0.135093598 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:35:12 localhost podman[92675]: 2026-02-20 08:35:12.259878256 +0000 UTC m=+0.188866393 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:35:12 localhost podman[92675]: 2026-02-20 08:35:12.266921662 +0000 UTC m=+0.195909729 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, 
com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510) Feb 20 03:35:12 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:35:12 localhost podman[92682]: 2026-02-20 08:35:12.278064535 +0000 UTC m=+0.199384186 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:35:12 localhost podman[92681]: 2026-02-20 08:35:12.276353103 +0000 UTC m=+0.200913073 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, 
com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5) Feb 20 03:35:12 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:35:12 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:35:12 localhost podman[92674]: 2026-02-20 08:35:12.382950963 +0000 UTC m=+0.319768071 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 20 03:35:12 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:35:13 localhost sshd[92765]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:35:15 localhost sshd[92767]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:35:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:35:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:35:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:35:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:35:19 localhost systemd[1]: tmp-crun.fptsZk.mount: Deactivated successfully. Feb 20 03:35:19 localhost podman[92771]: 2026-02-20 08:35:19.163462665 +0000 UTC m=+0.097553423 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:35:19 localhost podman[92769]: 2026-02-20 08:35:19.197825922 +0000 UTC m=+0.132887659 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com) Feb 20 03:35:19 localhost podman[92769]: 2026-02-20 08:35:19.221412919 +0000 UTC m=+0.156474656 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, 
name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, release=1766032510, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:35:19 localhost 
systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:35:19 localhost podman[92772]: 2026-02-20 08:35:19.298758998 +0000 UTC m=+0.227666776 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, distribution-scope=public) Feb 20 03:35:19 localhost podman[92771]: 2026-02-20 08:35:19.323243272 +0000 UTC m=+0.257333980 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, 
batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 20 03:35:19 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:35:19 localhost podman[92770]: 2026-02-20 08:35:19.409215817 +0000 UTC m=+0.340142068 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 20 03:35:19 localhost podman[92770]: 2026-02-20 08:35:19.420915667 +0000 UTC m=+0.351841898 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=iscsid, build-date=2026-01-12T22:34:43Z) Feb 20 03:35:19 localhost podman[92772]: 
2026-02-20 08:35:19.428331316 +0000 UTC m=+0.357239154 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5) Feb 20 03:35:19 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:35:19 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:35:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:35:23 localhost podman[92859]: 2026-02-20 08:35:23.129850443 +0000 UTC m=+0.069003134 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, container_name=nova_migration_target, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510) Feb 20 03:35:23 localhost podman[92859]: 2026-02-20 08:35:23.532227414 +0000 UTC m=+0.471380135 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, version=17.1.13) Feb 20 03:35:23 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:35:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:35:41 localhost systemd[1]: tmp-crun.EVh6S3.mount: Deactivated successfully. 
Feb 20 03:35:41 localhost podman[92882]: 2026-02-20 08:35:41.163987563 +0000 UTC m=+0.096956874 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Feb 20 03:35:41 localhost podman[92882]: 2026-02-20 08:35:41.357005432 +0000 UTC m=+0.289974693 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, container_name=metrics_qdr, io.buildah.version=1.41.5) Feb 20 03:35:41 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:35:43 localhost podman[92911]: 2026-02-20 08:35:43.165701948 +0000 UTC m=+0.099600696 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Feb 20 03:35:43 localhost systemd[1]: tmp-crun.FE5lAj.mount: Deactivated successfully. Feb 20 03:35:43 localhost podman[92912]: 2026-02-20 08:35:43.229281865 +0000 UTC m=+0.161277644 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, container_name=logrotate_crond, version=17.1.13, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team) Feb 20 03:35:43 localhost podman[92912]: 2026-02-20 08:35:43.268200443 +0000 UTC m=+0.200196202 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:35:43 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:35:43 localhost podman[92913]: 2026-02-20 08:35:43.322880945 +0000 UTC m=+0.251343775 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, container_name=collectd, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Feb 20 03:35:43 localhost podman[92913]: 2026-02-20 08:35:43.337055971 +0000 UTC m=+0.265518841 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, container_name=collectd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, distribution-scope=public) Feb 20 03:35:43 localhost podman[92911]: 2026-02-20 08:35:43.345669866 +0000 UTC m=+0.279568644 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, distribution-scope=public, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510) Feb 20 03:35:43 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:35:43 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:35:43 localhost podman[92914]: 2026-02-20 08:35:43.273751464 +0000 UTC m=+0.199133810 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.13, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:35:43 localhost podman[92914]: 2026-02-20 08:35:43.411188183 +0000 UTC m=+0.336570489 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:35:43 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:35:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:35:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:35:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:35:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:35:50 localhost podman[93002]: 2026-02-20 08:35:50.14549376 +0000 UTC m=+0.079148366 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:35:50 localhost podman[93002]: 2026-02-20 08:35:50.190160654 +0000 UTC m=+0.123815270 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:35:50 localhost podman[93001]: 2026-02-20 08:35:50.190208306 +0000 UTC m=+0.124143411 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:35:50 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:35:50 localhost podman[93000]: 2026-02-20 08:35:50.258853318 +0000 UTC m=+0.196501097 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 
17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git) Feb 20 03:35:50 localhost podman[93001]: 2026-02-20 08:35:50.275102479 +0000 UTC m=+0.209037584 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510) Feb 20 03:35:50 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:35:50 localhost podman[93003]: 2026-02-20 08:35:50.359773914 +0000 UTC m=+0.288022584 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:35:50 localhost podman[93000]: 2026-02-20 08:35:50.380734268 +0000 UTC m=+0.318382057 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:35:50 localhost podman[93003]: 2026-02-20 08:35:50.392072978 +0000 UTC m=+0.320321658 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true) Feb 20 03:35:50 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:35:50 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:35:51 localhost systemd[1]: tmp-crun.t9BRVD.mount: Deactivated successfully. Feb 20 03:35:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:35:54 localhost podman[93087]: 2026-02-20 08:35:54.12624439 +0000 UTC m=+0.071743648 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510) Feb 20 03:35:54 localhost podman[93087]: 2026-02-20 08:35:54.527182778 +0000 UTC m=+0.472682046 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:35:54 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:35:57 localhost sshd[93186]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:35:58 localhost sshd[93188]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:35:59 localhost sshd[93190]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:36:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:36:12 localhost systemd[1]: tmp-crun.q3Qwka.mount: Deactivated successfully. Feb 20 03:36:12 localhost podman[93192]: 2026-02-20 08:36:12.15781451 +0000 UTC m=+0.095028525 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 03:36:12 localhost podman[93192]: 2026-02-20 08:36:12.339849031 +0000 UTC m=+0.277063056 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5) Feb 20 03:36:12 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:36:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:36:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:36:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:36:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:36:14 localhost systemd[1]: tmp-crun.dPVgkz.mount: Deactivated successfully. Feb 20 03:36:14 localhost podman[93222]: 2026-02-20 08:36:14.21431452 +0000 UTC m=+0.148351016 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:36:14 localhost systemd[1]: tmp-crun.zhzBXK.mount: Deactivated successfully. Feb 20 03:36:14 localhost podman[93223]: 2026-02-20 08:36:14.226587209 +0000 UTC m=+0.155567039 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:36:14 localhost podman[93222]: 2026-02-20 08:36:14.273087399 +0000 UTC m=+0.207123895 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, vcs-type=git, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=) Feb 20 03:36:14 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:36:14 localhost podman[93223]: 2026-02-20 08:36:14.287174972 +0000 UTC m=+0.216154862 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=logrotate_crond, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:36:14 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:36:14 localhost podman[93224]: 2026-02-20 08:36:14.275821883 +0000 UTC m=+0.201046048 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 20 03:36:14 localhost podman[93224]: 2026-02-20 08:36:14.359270261 +0000 UTC m=+0.284494406 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, container_name=collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:36:14 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:36:14 localhost podman[93225]: 2026-02-20 08:36:14.382734373 +0000 UTC m=+0.306375428 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:36:14 localhost podman[93225]: 2026-02-20 08:36:14.439376856 +0000 UTC m=+0.363017901 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:36:14 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:36:15 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:36:15 localhost recover_tripleo_nova_virtqemud[93315]: 63005 Feb 20 03:36:15 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 20 03:36:15 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:36:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:36:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:36:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:36:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:36:21 localhost systemd[1]: tmp-crun.YyP2J4.mount: Deactivated successfully. Feb 20 03:36:21 localhost podman[93317]: 2026-02-20 08:36:21.169421395 +0000 UTC m=+0.097279854 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, tcib_managed=true, container_name=iscsid, release=1766032510, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step3) Feb 20 03:36:21 localhost podman[93317]: 2026-02-20 08:36:21.207422495 +0000 UTC m=+0.135280904 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red 
Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid) Feb 20 03:36:21 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:36:21 localhost podman[93319]: 2026-02-20 08:36:21.260609841 +0000 UTC m=+0.181709092 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, container_name=nova_compute, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:36:21 localhost podman[93316]: 2026-02-20 08:36:21.225898373 +0000 UTC m=+0.155822436 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 20 03:36:21 localhost podman[93319]: 2026-02-20 08:36:21.289091728 +0000 UTC m=+0.210190969 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, container_name=nova_compute, io.openshift.expose-services=, config_id=tripleo_step5, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:36:21 localhost podman[93316]: 2026-02-20 08:36:21.305597076 +0000 UTC m=+0.235521069 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public) Feb 20 03:36:21 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. 
Feb 20 03:36:21 localhost podman[93318]: 2026-02-20 08:36:21.322202006 +0000 UTC m=+0.244518275 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510) Feb 20 03:36:21 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:36:21 localhost podman[93318]: 2026-02-20 08:36:21.367890603 +0000 UTC m=+0.290206862 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T22:56:19Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Feb 20 03:36:21 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:36:22 localhost systemd[1]: tmp-crun.26zMlR.mount: Deactivated successfully. Feb 20 03:36:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:36:25 localhost podman[93412]: 2026-02-20 08:36:25.144193122 +0000 UTC m=+0.082290544 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:36:25 localhost podman[93412]: 2026-02-20 08:36:25.516395965 +0000 UTC m=+0.454493387 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container) Feb 20 03:36:25 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:36:40 localhost sshd[93436]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:36:41 localhost sshd[93437]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:36:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:36:43 localhost systemd[1]: tmp-crun.th9NNV.mount: Deactivated successfully. 
Feb 20 03:36:43 localhost podman[93439]: 2026-02-20 08:36:43.283358373 +0000 UTC m=+0.107736297 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step1, release=1766032510, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:36:43 localhost podman[93439]: 2026-02-20 08:36:43.484180873 +0000 UTC m=+0.308558797 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Feb 20 03:36:43 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:36:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:36:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:36:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:36:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:36:45 localhost podman[93472]: 2026-02-20 08:36:45.127572492 +0000 UTC m=+0.068114477 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64) Feb 20 03:36:45 localhost systemd[1]: tmp-crun.oRLO6e.mount: Deactivated successfully. 
Feb 20 03:36:45 localhost podman[93469]: 2026-02-20 08:36:45.144711879 +0000 UTC m=+0.086553375 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.buildah.version=1.41.5) Feb 20 03:36:45 localhost podman[93472]: 2026-02-20 08:36:45.153151278 +0000 UTC m=+0.093693343 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc.) Feb 20 03:36:45 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:36:45 localhost podman[93470]: 2026-02-20 08:36:45.186531226 +0000 UTC m=+0.126226055 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:36:45 localhost podman[93470]: 2026-02-20 08:36:45.196887035 +0000 UTC m=+0.136581834 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.buildah.version=1.41.5, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team) Feb 20 03:36:45 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:36:45 localhost podman[93471]: 2026-02-20 08:36:45.233714848 +0000 UTC m=+0.172702996 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510) Feb 20 03:36:45 localhost podman[93471]: 2026-02-20 08:36:45.244806909 +0000 UTC m=+0.183795097 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, 
com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 20 03:36:45 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:36:45 localhost podman[93469]: 2026-02-20 08:36:45.298726948 +0000 UTC m=+0.240568494 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 20 03:36:45 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:36:50 localhost sshd[93559]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:36:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:36:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:36:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:36:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:36:51 localhost systemd[1]: tmp-crun.w4YXm1.mount: Deactivated successfully. 
Feb 20 03:36:51 localhost podman[93561]: 2026-02-20 08:36:51.578582807 +0000 UTC m=+0.074449903 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, container_name=ovn_controller, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:36:51 localhost podman[93561]: 2026-02-20 08:36:51.607130175 +0000 UTC m=+0.102997281 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=) Feb 20 03:36:51 localhost podman[93564]: 2026-02-20 08:36:51.632202797 +0000 UTC m=+0.124377858 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:36:51 localhost podman[93563]: 2026-02-20 08:36:51.682842535 +0000 UTC m=+0.175879413 container health_status 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 
17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:36:51 localhost podman[93564]: 2026-02-20 08:36:51.691981756 +0000 UTC m=+0.184156787 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, distribution-scope=public, tcib_managed=true, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.) Feb 20 03:36:51 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:36:51 localhost podman[93563]: 2026-02-20 08:36:51.733015299 +0000 UTC m=+0.226052157 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=) Feb 20 03:36:51 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:36:51 localhost podman[93562]: 2026-02-20 08:36:51.745510414 +0000 UTC m=+0.239041937 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, container_name=iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1) Feb 20 03:36:51 localhost podman[93562]: 2026-02-20 08:36:51.753142118 +0000 UTC m=+0.246673631 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:36:51 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:36:51 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:36:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:36:56 localhost systemd[1]: tmp-crun.yoKNgr.mount: Deactivated successfully. 
Feb 20 03:36:56 localhost podman[93654]: 2026-02-20 08:36:56.147885059 +0000 UTC m=+0.088862836 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target) Feb 20 03:36:56 localhost podman[93654]: 2026-02-20 08:36:56.480972248 +0000 UTC m=+0.421949965 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.13, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true) Feb 20 03:36:56 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:37:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:37:14 localhost podman[93755]: 2026-02-20 08:37:14.146336747 +0000 UTC m=+0.082752688 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, batch=17.1_20260112.1) Feb 20 03:37:14 localhost podman[93755]: 2026-02-20 08:37:14.340148671 +0000 UTC m=+0.276564622 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5) Feb 20 03:37:14 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:37:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:37:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:37:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:37:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:37:16 localhost systemd[1]: tmp-crun.P7LJh3.mount: Deactivated successfully. Feb 20 03:37:16 localhost podman[93786]: 2026-02-20 08:37:16.192763178 +0000 UTC m=+0.125221894 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:37:16 localhost podman[93785]: 2026-02-20 08:37:16.203259141 +0000 UTC m=+0.136558733 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 20 03:37:16 localhost podman[93786]: 2026-02-20 08:37:16.204296943 +0000 UTC m=+0.136755739 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red 
Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, com.redhat.component=openstack-cron-container, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 20 03:37:16 localhost podman[93793]: 2026-02-20 08:37:16.162717444 +0000 UTC m=+0.086384399 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:47Z) Feb 20 03:37:16 localhost podman[93793]: 2026-02-20 08:37:16.250117193 +0000 UTC m=+0.173784108 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, 
container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 20 03:37:16 localhost podman[93785]: 2026-02-20 08:37:16.260152061 +0000 UTC m=+0.193451653 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:37:16 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:37:16 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:37:16 localhost podman[93787]: 2026-02-20 08:37:16.290371951 +0000 UTC m=+0.221017322 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:37:16 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:37:16 localhost podman[93787]: 2026-02-20 08:37:16.33124682 +0000 UTC m=+0.261892181 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:37:16 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:37:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:37:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:37:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:37:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:37:22 localhost systemd[1]: tmp-crun.ERRnGS.mount: Deactivated successfully. 
Feb 20 03:37:22 localhost podman[93876]: 2026-02-20 08:37:22.154547657 +0000 UTC m=+0.081587701 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:37:22 localhost podman[93875]: 2026-02-20 08:37:22.174054368 +0000 UTC m=+0.099686858 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 
17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, release=1766032510, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible) Feb 20 03:37:22 localhost podman[93874]: 2026-02-20 08:37:22.212065018 +0000 UTC m=+0.145174619 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Feb 20 03:37:22 localhost podman[93876]: 2026-02-20 08:37:22.219071524 +0000 UTC m=+0.146111578 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 20 03:37:22 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:37:22 localhost podman[93874]: 2026-02-20 08:37:22.232463605 +0000 UTC m=+0.165573236 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1766032510, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:37:22 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:37:22 localhost podman[93875]: 2026-02-20 08:37:22.308196435 +0000 UTC m=+0.233828925 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid) Feb 20 03:37:22 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:37:22 localhost podman[93882]: 2026-02-20 08:37:22.321606698 +0000 UTC m=+0.242518823 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5) Feb 20 03:37:22 localhost podman[93882]: 2026-02-20 08:37:22.38014368 +0000 UTC m=+0.301055795 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13) Feb 20 03:37:22 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:37:23 localhost systemd[1]: tmp-crun.J5bfoq.mount: Deactivated successfully. Feb 20 03:37:24 localhost sshd[93965]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:37:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:37:27 localhost podman[93967]: 2026-02-20 08:37:27.123700473 +0000 UTC m=+0.065198907 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1) Feb 20 03:37:27 localhost podman[93967]: 2026-02-20 08:37:27.500054574 +0000 UTC m=+0.441553058 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=nova_migration_target, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 20 03:37:27 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:37:36 localhost sshd[93991]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:37:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:37:45 localhost systemd[1]: tmp-crun.N2fX2x.mount: Deactivated successfully. 
Feb 20 03:37:45 localhost podman[93993]: 2026-02-20 08:37:45.156817166 +0000 UTC m=+0.090122844 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:37:45 localhost podman[93993]: 2026-02-20 08:37:45.332068549 +0000 UTC m=+0.265374157 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13) Feb 20 03:37:45 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:37:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:37:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:37:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:37:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:37:47 localhost systemd[1]: tmp-crun.HUPehq.mount: Deactivated successfully. Feb 20 03:37:47 localhost podman[94024]: 2026-02-20 08:37:47.17117045 +0000 UTC m=+0.104695283 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13) Feb 20 03:37:47 localhost podman[94026]: 2026-02-20 08:37:47.217878936 +0000 UTC m=+0.145421965 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vcs-type=git, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:37:47 localhost podman[94024]: 2026-02-20 08:37:47.225598904 +0000 UTC m=+0.159123737 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, 
build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 20 03:37:47 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:37:47 localhost podman[94026]: 2026-02-20 08:37:47.259330062 +0000 UTC m=+0.186873061 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:37:47 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:37:47 localhost podman[94027]: 2026-02-20 08:37:47.316333046 +0000 UTC m=+0.240830202 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:37:47 localhost podman[94025]: 2026-02-20 08:37:47.367201222 +0000 UTC m=+0.297997761 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:37:47 localhost podman[94025]: 2026-02-20 08:37:47.380083158 +0000 UTC m=+0.310879637 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack 
TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:37:47 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:37:47 localhost podman[94027]: 2026-02-20 08:37:47.431326074 +0000 UTC m=+0.355823230 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, container_name=ceilometer_agent_compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T23:07:47Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:37:47 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:37:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:37:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:37:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:37:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:37:53 localhost podman[94116]: 2026-02-20 08:37:53.147992711 +0000 UTC m=+0.082457428 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.13, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 20 03:37:53 localhost systemd[1]: tmp-crun.RlVZvF.mount: Deactivated successfully. Feb 20 03:37:53 localhost podman[94116]: 2026-02-20 08:37:53.19713202 +0000 UTC m=+0.131596747 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container) Feb 20 03:37:53 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:37:53 localhost podman[94119]: 2026-02-20 08:37:53.176990097 +0000 UTC m=+0.101609400 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true) Feb 20 03:37:53 localhost podman[94119]: 2026-02-20 08:37:53.311421121 +0000 UTC m=+0.236040434 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git) Feb 20 03:37:53 localhost podman[94117]: 2026-02-20 08:37:53.325714924 +0000 UTC m=+0.257536719 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step3, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, build-date=2026-01-12T22:34:43Z, 
io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:37:53 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:37:53 localhost podman[94117]: 2026-02-20 08:37:53.364897764 +0000 UTC m=+0.296719569 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:37:53 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:37:53 localhost podman[94118]: 2026-02-20 08:37:53.199157622 +0000 UTC m=+0.129558224 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com) Feb 20 03:37:53 localhost podman[94118]: 2026-02-20 08:37:53.449249381 +0000 UTC m=+0.379650003 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:37:53 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:37:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:37:58 localhost systemd[1]: tmp-crun.LkwIx4.mount: Deactivated successfully. Feb 20 03:37:58 localhost podman[94208]: 2026-02-20 08:37:58.17031129 +0000 UTC m=+0.105922625 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, release=1766032510, config_id=tripleo_step4, container_name=nova_migration_target, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:37:58 localhost podman[94208]: 2026-02-20 08:37:58.570207907 +0000 UTC m=+0.505819212 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute) Feb 20 03:37:58 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:38:10 localhost sshd[94307]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:38:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:38:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:38:16 localhost recover_tripleo_nova_virtqemud[94311]: 63005 Feb 20 03:38:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:38:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:38:16 localhost podman[94309]: 2026-02-20 08:38:16.151014577 +0000 UTC m=+0.087637819 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, architecture=x86_64) Feb 20 03:38:16 localhost podman[94309]: 2026-02-20 08:38:16.344785815 +0000 UTC m=+0.281408997 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, release=1766032510, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z) Feb 20 03:38:16 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:38:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:38:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:38:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:38:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:38:18 localhost systemd[1]: tmp-crun.faAJTs.mount: Deactivated successfully. Feb 20 03:38:18 localhost podman[94343]: 2026-02-20 08:38:18.218968331 +0000 UTC m=+0.141049760 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, build-date=2026-01-12T22:10:15Z) Feb 20 03:38:18 localhost podman[94343]: 2026-02-20 08:38:18.229438384 +0000 UTC m=+0.151519813 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible) Feb 20 03:38:18 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:38:18 localhost podman[94342]: 2026-02-20 08:38:18.185959251 +0000 UTC m=+0.115663696 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z) Feb 20 03:38:18 localhost podman[94344]: 2026-02-20 08:38:18.289775229 +0000 UTC m=+0.210153645 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, 
io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 20 03:38:18 localhost podman[94342]: 2026-02-20 08:38:18.315060481 +0000 UTC m=+0.244764936 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Feb 20 03:38:18 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:38:18 localhost podman[94341]: 2026-02-20 08:38:18.378828521 +0000 UTC m=+0.310967940 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:38:18 localhost podman[94344]: 2026-02-20 08:38:18.399389776 +0000 UTC m=+0.319768222 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64) Feb 20 03:38:18 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:38:18 localhost podman[94341]: 2026-02-20 08:38:18.412883314 +0000 UTC m=+0.345022753 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, release=1766032510, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git) Feb 20 03:38:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:38:23 localhost sshd[94433]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:38:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:38:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:38:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. 
Feb 20 03:38:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:38:24 localhost systemd[1]: tmp-crun.gkH0eJ.mount: Deactivated successfully. Feb 20 03:38:24 localhost podman[94435]: 2026-02-20 08:38:24.067787109 +0000 UTC m=+0.082661995 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5) Feb 20 03:38:24 localhost systemd[1]: tmp-crun.3ZoIUs.mount: Deactivated successfully. Feb 20 03:38:24 localhost podman[94436]: 2026-02-20 08:38:24.086844868 +0000 UTC m=+0.097034339 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64) Feb 20 03:38:24 localhost podman[94436]: 2026-02-20 08:38:24.099948963 +0000 UTC m=+0.110138464 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.expose-services=, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:38:24 localhost systemd[1]: 
5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:38:24 localhost podman[94435]: 2026-02-20 08:38:24.120166878 +0000 UTC m=+0.135041764 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, architecture=x86_64, 
io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:38:24 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully. Feb 20 03:38:24 localhost podman[94438]: 2026-02-20 08:38:24.19172917 +0000 UTC m=+0.194988747 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true) Feb 20 03:38:24 localhost podman[94437]: 2026-02-20 08:38:24.168831901 +0000 UTC m=+0.178848467 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 20 03:38:24 localhost podman[94438]: 2026-02-20 08:38:24.247009697 +0000 UTC m=+0.250269234 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, architecture=x86_64, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 20 03:38:24 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:38:24 localhost podman[94437]: 2026-02-20 08:38:24.30341045 +0000 UTC m=+0.313427006 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, release=1766032510, distribution-scope=public, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64) Feb 20 03:38:24 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. Feb 20 03:38:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:38:29 localhost systemd[1]: tmp-crun.cDLPeY.mount: Deactivated successfully. 
Feb 20 03:38:29 localhost podman[94527]: 2026-02-20 08:38:29.147033787 +0000 UTC m=+0.085740359 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:38:29 localhost podman[94527]: 2026-02-20 08:38:29.525126901 +0000 UTC m=+0.463833493 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, release=1766032510, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:38:29 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:38:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:38:47 localhost podman[94551]: 2026-02-20 08:38:47.13766635 +0000 UTC m=+0.074913435 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible) Feb 20 03:38:47 localhost podman[94551]: 2026-02-20 08:38:47.36802444 +0000 UTC m=+0.305271525 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1766032510, container_name=metrics_qdr, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:38:47 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:38:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:38:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:38:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:38:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:38:49 localhost systemd[1]: tmp-crun.jaQU2P.mount: Deactivated successfully. Feb 20 03:38:49 localhost podman[94582]: 2026-02-20 08:38:49.165322557 +0000 UTC m=+0.099289108 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team) Feb 20 03:38:49 localhost podman[94582]: 2026-02-20 08:38:49.205222821 +0000 UTC m=+0.139189372 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:38:49 localhost podman[94581]: 2026-02-20 08:38:49.205488289 +0000 UTC m=+0.141367040 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5) Feb 20 03:38:49 localhost podman[94583]: 2026-02-20 08:38:49.264763091 +0000 UTC m=+0.194424600 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:38:49 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:38:49 localhost podman[94583]: 2026-02-20 08:38:49.276007588 +0000 UTC m=+0.205669107 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.13, container_name=collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 20 03:38:49 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:38:49 localhost podman[94581]: 2026-02-20 08:38:49.311383671 +0000 UTC m=+0.247262422 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, tcib_managed=true, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:38:49 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:38:49 localhost podman[94585]: 2026-02-20 08:38:49.369756275 +0000 UTC m=+0.295076349 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z) Feb 20 03:38:49 localhost podman[94585]: 2026-02-20 08:38:49.427040516 +0000 UTC m=+0.352360580 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, 
name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4) Feb 20 03:38:49 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:38:50 localhost systemd[1]: tmp-crun.dHmQTI.mount: Deactivated successfully. Feb 20 03:38:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:38:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:38:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:38:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:38:55 localhost systemd[1]: tmp-crun.1IBQlj.mount: Deactivated successfully. 
Feb 20 03:38:55 localhost podman[94681]: 2026-02-20 08:38:55.180936181 +0000 UTC m=+0.110703392 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, config_id=tripleo_step5, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:38:55 localhost podman[94676]: 2026-02-20 08:38:55.145606379 +0000 UTC m=+0.085818903 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, io.openshift.expose-services=, container_name=ovn_controller, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:38:55 localhost podman[94678]: 2026-02-20 08:38:55.201070573 +0000 UTC m=+0.133931799 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:38:55 localhost podman[94676]: 2026-02-20 08:38:55.224308261 +0000 UTC m=+0.164520775 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Feb 20 03:38:55 localhost podman[94676]: unhealthy Feb 20 03:38:55 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:38:55 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 03:38:55 localhost podman[94681]: 2026-02-20 08:38:55.266043201 +0000 UTC m=+0.195810402 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5) Feb 20 03:38:55 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:38:55 localhost podman[94677]: 2026-02-20 08:38:55.317516261 +0000 UTC m=+0.255682692 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:38:55 localhost podman[94678]: 2026-02-20 08:38:55.327900042 +0000 UTC m=+0.260761278 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 20 03:38:55 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:38:55 localhost podman[94677]: 2026-02-20 08:38:55.357235789 +0000 UTC m=+0.295402250 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, 
com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:34:43Z, tcib_managed=true, url=https://www.redhat.com) Feb 20 03:38:55 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:38:57 localhost sshd[94771]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:39:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:39:00 localhost systemd[1]: tmp-crun.EI8Tkp.mount: Deactivated successfully. 
Feb 20 03:39:00 localhost podman[94773]: 2026-02-20 08:39:00.141004545 +0000 UTC m=+0.083510752 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 20 03:39:00 localhost podman[94773]: 2026-02-20 08:39:00.513058042 +0000 UTC m=+0.455564289 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510) Feb 20 03:39:00 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:39:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:39:18 localhost systemd[1]: tmp-crun.oe2DZI.mount: Deactivated successfully. 
Feb 20 03:39:18 localhost podman[94918]: 2026-02-20 08:39:18.159045102 +0000 UTC m=+0.101023023 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:39:18 localhost podman[94918]: 2026-02-20 08:39:18.353440369 +0000 UTC m=+0.295418300 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, 
konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64) Feb 20 03:39:18 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:39:19 localhost sshd[94946]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:39:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:39:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:39:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:39:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:39:20 localhost podman[94949]: 2026-02-20 08:39:20.168492047 +0000 UTC m=+0.093017175 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, container_name=logrotate_crond) Feb 20 03:39:20 localhost podman[94949]: 2026-02-20 08:39:20.208952318 +0000 UTC m=+0.133477446 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container) Feb 20 03:39:20 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:39:20 localhost podman[94948]: 2026-02-20 08:39:20.210070982 +0000 UTC m=+0.139127930 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:39:20 localhost podman[94956]: 2026-02-20 08:39:20.263003018 +0000 UTC m=+0.177958010 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:39:20 localhost podman[94956]: 2026-02-20 08:39:20.315887522 +0000 UTC m=+0.230842504 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13) Feb 20 03:39:20 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:39:20 localhost podman[94950]: 2026-02-20 08:39:20.330110372 +0000 UTC m=+0.250920736 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:39:20 localhost podman[94948]: 2026-02-20 08:39:20.342341439 +0000 UTC m=+0.271398437 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.13, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:39:20 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:39:20 localhost podman[94950]: 2026-02-20 08:39:20.363096991 +0000 UTC m=+0.283907355 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git) Feb 20 03:39:20 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:39:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:39:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:39:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:39:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:39:26 localhost systemd[1]: tmp-crun.80yMm2.mount: Deactivated successfully. 
Feb 20 03:39:26 localhost podman[95043]: 2026-02-20 08:39:26.153731441 +0000 UTC m=+0.089255249 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:39:26 localhost systemd[1]: tmp-crun.xw4tMN.mount: Deactivated successfully. 
Feb 20 03:39:26 localhost podman[95044]: 2026-02-20 08:39:26.215294693 +0000 UTC m=+0.147861700 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, config_id=tripleo_step5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 20 03:39:26 localhost podman[95043]: 2026-02-20 08:39:26.241849533 +0000 UTC m=+0.177373311 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, release=1766032510, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, 
build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 20 03:39:26 localhost podman[95044]: 2026-02-20 08:39:26.242246166 +0000 UTC m=+0.174813173 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
nova-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:39:26 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:39:26 localhost podman[95042]: 2026-02-20 08:39:26.253250296 +0000 UTC m=+0.190978173 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:39:26 localhost podman[95042]: 2026-02-20 08:39:26.261890933 +0000 UTC m=+0.199618800 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, container_name=iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:39:26 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:39:26 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully. 
Feb 20 03:39:26 localhost podman[95041]: 2026-02-20 08:39:26.307522093 +0000 UTC m=+0.248393777 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-type=git, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, 
io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:39:26 localhost podman[95041]: 2026-02-20 08:39:26.328850512 +0000 UTC m=+0.269722236 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_id=tripleo_step4, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team) Feb 20 03:39:26 localhost podman[95041]: unhealthy Feb 20 03:39:26 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:39:26 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:39:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:39:31 localhost podman[95135]: 2026-02-20 08:39:31.14398142 +0000 UTC m=+0.078503667 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, description=Red 
Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64) Feb 20 03:39:31 localhost podman[95135]: 2026-02-20 08:39:31.504938974 +0000 UTC m=+0.439461231 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, 
architecture=x86_64, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.13) Feb 20 03:39:31 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:39:44 localhost sshd[95158]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:39:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:39:49 localhost podman[95160]: 2026-02-20 08:39:49.140764695 +0000 UTC m=+0.082858181 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_id=tripleo_step1, tcib_managed=true, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, release=1766032510) Feb 20 03:39:49 localhost podman[95160]: 2026-02-20 08:39:49.319835019 +0000 UTC m=+0.261928445 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, release=1766032510, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 20 03:39:49 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:39:51 localhost systemd[1]: tmp-crun.jd7FuK.mount: Deactivated successfully. Feb 20 03:39:51 localhost podman[95189]: 2026-02-20 08:39:51.165257637 +0000 UTC m=+0.099301299 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:39:51 localhost podman[95189]: 2026-02-20 08:39:51.203958883 +0000 UTC m=+0.138002485 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:39:51 localhost systemd[1]: tmp-crun.Y0Y2pd.mount: Deactivated successfully. Feb 20 03:39:51 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:39:51 localhost podman[95190]: 2026-02-20 08:39:51.208219264 +0000 UTC m=+0.138660345 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3) Feb 20 03:39:51 localhost podman[95190]: 2026-02-20 08:39:51.294073397 +0000 UTC m=+0.224514488 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13) Feb 20 03:39:51 localhost podman[95188]: 2026-02-20 08:39:51.305595853 +0000 UTC m=+0.241367109 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, 
name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team) Feb 20 03:39:51 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:39:51 localhost podman[95191]: 2026-02-20 08:39:51.259992934 +0000 UTC m=+0.187643029 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:39:51 localhost podman[95191]: 2026-02-20 08:39:51.343894197 +0000 UTC m=+0.271544222 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:39:51 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:39:51 localhost podman[95188]: 2026-02-20 08:39:51.361901753 +0000 UTC m=+0.297673009 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:39:51 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:39:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:39:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:39:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:39:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:39:57 localhost podman[95277]: 2026-02-20 08:39:57.13812844 +0000 UTC m=+0.075901126 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 
17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, vcs-type=git) Feb 20 03:39:57 localhost podman[95277]: 2026-02-20 08:39:57.150896245 +0000 UTC m=+0.088668941 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:39:57 localhost podman[95277]: unhealthy Feb 20 03:39:57 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:39:57 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:39:57 localhost systemd[1]: tmp-crun.vAJkNJ.mount: Deactivated successfully. 
Feb 20 03:39:57 localhost podman[95278]: 2026-02-20 08:39:57.204797931 +0000 UTC m=+0.139488712 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:39:57 localhost podman[95278]: 2026-02-20 08:39:57.21187166 +0000 UTC m=+0.146562431 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git) Feb 20 03:39:57 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:39:57 localhost podman[95279]: 2026-02-20 08:39:57.251717521 +0000 UTC m=+0.183529093 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_metadata_agent) Feb 20 03:39:57 localhost podman[95279]: 2026-02-20 08:39:57.271360458 +0000 UTC m=+0.203172100 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 20 03:39:57 localhost podman[95280]: 2026-02-20 08:39:57.307163734 +0000 UTC m=+0.235547131 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 20 03:39:57 localhost podman[95279]: unhealthy Feb 20 03:39:57 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:39:57 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:39:57 localhost podman[95280]: 2026-02-20 08:39:57.343986002 +0000 UTC m=+0.272369409 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com) Feb 20 03:39:57 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:40:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:40:02 localhost systemd[1]: tmp-crun.o3o9JG.mount: Deactivated successfully. 
Feb 20 03:40:02 localhost podman[95361]: 2026-02-20 08:40:02.168512938 +0000 UTC m=+0.096429712 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:40:02 localhost podman[95361]: 2026-02-20 08:40:02.52099134 +0000 UTC m=+0.448908064 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, release=1766032510, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team) Feb 20 03:40:02 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:40:10 localhost podman[95517]: Feb 20 03:40:10 localhost podman[95517]: 2026-02-20 08:40:10.781207863 +0000 UTC m=+0.085749900 container create 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, RELEASE=main, vendor=Red Hat, Inc., release=1770267347, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, version=7) Feb 20 03:40:10 localhost systemd[1]: Started libpod-conmon-81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6.scope. Feb 20 03:40:10 localhost podman[95517]: 2026-02-20 08:40:10.743021503 +0000 UTC m=+0.047563580 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 03:40:10 localhost systemd[1]: Started libcrun container. 
Feb 20 03:40:10 localhost podman[95517]: 2026-02-20 08:40:10.865182388 +0000 UTC m=+0.169724425 container init 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, release=1770267347) Feb 20 03:40:10 localhost systemd[1]: tmp-crun.p4Vysq.mount: Deactivated successfully. 
Feb 20 03:40:10 localhost podman[95517]: 2026-02-20 08:40:10.882134902 +0000 UTC m=+0.186676939 container start 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.buildah.version=1.42.2, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, build-date=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 03:40:10 localhost podman[95517]: 2026-02-20 08:40:10.88241478 +0000 UTC m=+0.186956817 container attach 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, ceph=True, version=7, GIT_CLEAN=True, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph) Feb 20 03:40:10 localhost systemd[1]: libpod-81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6.scope: Deactivated successfully. Feb 20 03:40:10 localhost affectionate_carson[95532]: 167 167 Feb 20 03:40:10 localhost podman[95517]: 2026-02-20 08:40:10.886950151 +0000 UTC m=+0.191492198 container died 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.buildah.version=1.42.2, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public) Feb 20 03:40:10 localhost podman[95537]: 2026-02-20 08:40:10.97948537 +0000 UTC m=+0.081248021 container remove 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, RELEASE=main, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True) Feb 20 03:40:10 localhost systemd[1]: libpod-conmon-81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6.scope: Deactivated successfully. 
Feb 20 03:40:11 localhost podman[95559]: Feb 20 03:40:11 localhost podman[95559]: 2026-02-20 08:40:11.20503135 +0000 UTC m=+0.070721837 container create 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, version=7, GIT_BRANCH=main, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main) Feb 20 03:40:11 localhost systemd[1]: Started libpod-conmon-8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b.scope. Feb 20 03:40:11 localhost systemd[1]: Started libcrun container. 
Feb 20 03:40:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ec2e015be892dc0cd8bf4eaf88422a2929d5af981474516572c1812a94c0617/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 03:40:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ec2e015be892dc0cd8bf4eaf88422a2929d5af981474516572c1812a94c0617/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 03:40:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ec2e015be892dc0cd8bf4eaf88422a2929d5af981474516572c1812a94c0617/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 03:40:11 localhost podman[95559]: 2026-02-20 08:40:11.179846012 +0000 UTC m=+0.045536599 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 03:40:11 localhost podman[95559]: 2026-02-20 08:40:11.28237393 +0000 UTC m=+0.148064397 container init 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:40:11 localhost podman[95559]: 2026-02-20 08:40:11.289368406 +0000 UTC m=+0.155058883 container start 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, build-date=2026-02-09T10:25:24Z, vcs-type=git, RELEASE=main, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.42.2, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux , distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph) Feb 20 03:40:11 localhost podman[95559]: 2026-02-20 08:40:11.289537012 +0000 UTC m=+0.155227509 container attach 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, io.openshift.tags=rhceph ceph, summary=Provides the latest 
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=) Feb 20 03:40:11 localhost systemd[1]: var-lib-containers-storage-overlay-ba5b980df86d8244b5e3c3b4fdfad53b098ba29b6a857bea57aa5794c7e56fd7-merged.mount: Deactivated successfully. 
Feb 20 03:40:12 localhost amazing_clarke[95575]: [ Feb 20 03:40:12 localhost amazing_clarke[95575]: { Feb 20 03:40:12 localhost amazing_clarke[95575]: "available": false, Feb 20 03:40:12 localhost amazing_clarke[95575]: "ceph_device": false, Feb 20 03:40:12 localhost amazing_clarke[95575]: "device_id": "QEMU_DVD-ROM_QM00001", Feb 20 03:40:12 localhost amazing_clarke[95575]: "lsm_data": {}, Feb 20 03:40:12 localhost amazing_clarke[95575]: "lvs": [], Feb 20 03:40:12 localhost amazing_clarke[95575]: "path": "/dev/sr0", Feb 20 03:40:12 localhost amazing_clarke[95575]: "rejected_reasons": [ Feb 20 03:40:12 localhost amazing_clarke[95575]: "Insufficient space (<5GB)", Feb 20 03:40:12 localhost amazing_clarke[95575]: "Has a FileSystem" Feb 20 03:40:12 localhost amazing_clarke[95575]: ], Feb 20 03:40:12 localhost amazing_clarke[95575]: "sys_api": { Feb 20 03:40:12 localhost amazing_clarke[95575]: "actuators": null, Feb 20 03:40:12 localhost amazing_clarke[95575]: "device_nodes": "sr0", Feb 20 03:40:12 localhost amazing_clarke[95575]: "human_readable_size": "482.00 KB", Feb 20 03:40:12 localhost amazing_clarke[95575]: "id_bus": "ata", Feb 20 03:40:12 localhost amazing_clarke[95575]: "model": "QEMU DVD-ROM", Feb 20 03:40:12 localhost amazing_clarke[95575]: "nr_requests": "2", Feb 20 03:40:12 localhost amazing_clarke[95575]: "partitions": {}, Feb 20 03:40:12 localhost amazing_clarke[95575]: "path": "/dev/sr0", Feb 20 03:40:12 localhost amazing_clarke[95575]: "removable": "1", Feb 20 03:40:12 localhost amazing_clarke[95575]: "rev": "2.5+", Feb 20 03:40:12 localhost amazing_clarke[95575]: "ro": "0", Feb 20 03:40:12 localhost amazing_clarke[95575]: "rotational": "1", Feb 20 03:40:12 localhost amazing_clarke[95575]: "sas_address": "", Feb 20 03:40:12 localhost amazing_clarke[95575]: "sas_device_handle": "", Feb 20 03:40:12 localhost amazing_clarke[95575]: "scheduler_mode": "mq-deadline", Feb 20 03:40:12 localhost amazing_clarke[95575]: "sectors": 0, Feb 20 03:40:12 localhost 
amazing_clarke[95575]: "sectorsize": "2048", Feb 20 03:40:12 localhost amazing_clarke[95575]: "size": 493568.0, Feb 20 03:40:12 localhost amazing_clarke[95575]: "support_discard": "0", Feb 20 03:40:12 localhost amazing_clarke[95575]: "type": "disk", Feb 20 03:40:12 localhost amazing_clarke[95575]: "vendor": "QEMU" Feb 20 03:40:12 localhost amazing_clarke[95575]: } Feb 20 03:40:12 localhost amazing_clarke[95575]: } Feb 20 03:40:12 localhost amazing_clarke[95575]: ] Feb 20 03:40:12 localhost systemd[1]: libpod-8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b.scope: Deactivated successfully. Feb 20 03:40:12 localhost systemd[1]: libpod-8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b.scope: Consumed 1.007s CPU time. Feb 20 03:40:12 localhost podman[97688]: 2026-02-20 08:40:12.307386045 +0000 UTC m=+0.035235590 container died 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container) Feb 20 03:40:12 localhost systemd[1]: tmp-crun.Bn8LDt.mount: Deactivated successfully. Feb 20 03:40:12 localhost systemd[1]: var-lib-containers-storage-overlay-6ec2e015be892dc0cd8bf4eaf88422a2929d5af981474516572c1812a94c0617-merged.mount: Deactivated successfully. Feb 20 03:40:12 localhost podman[97688]: 2026-02-20 08:40:12.342692086 +0000 UTC m=+0.070541611 container remove 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 03:40:12 localhost systemd[1]: libpod-conmon-8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b.scope: Deactivated successfully. 
Feb 20 03:40:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:40:16 localhost recover_tripleo_nova_virtqemud[97718]: 63005 Feb 20 03:40:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:40:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:40:18 localhost sshd[97719]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:40:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:40:20 localhost podman[97721]: 2026-02-20 08:40:20.160565221 +0000 UTC m=+0.090835388 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Feb 20 03:40:20 localhost podman[97721]: 2026-02-20 08:40:20.373714737 +0000 UTC m=+0.303984894 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, batch=17.1_20260112.1) Feb 20 03:40:20 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: 
Deactivated successfully. Feb 20 03:40:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:40:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:40:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:40:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:40:21 localhost systemd[1]: tmp-crun.iBgIBg.mount: Deactivated successfully. Feb 20 03:40:21 localhost podman[97751]: 2026-02-20 08:40:21.510214918 +0000 UTC m=+0.104230232 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:40:21 localhost podman[97751]: 2026-02-20 08:40:21.547016285 +0000 UTC m=+0.141031629 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, name=rhosp-rhel9/openstack-cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:40:21 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: 
Deactivated successfully. Feb 20 03:40:21 localhost podman[97753]: 2026-02-20 08:40:21.593499862 +0000 UTC m=+0.182907753 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:40:21 localhost podman[97756]: 2026-02-20 08:40:21.603705067 +0000 UTC m=+0.186694590 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi) Feb 20 03:40:21 localhost podman[97752]: 2026-02-20 08:40:21.570093729 +0000 UTC m=+0.162107411 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, container_name=collectd, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, 
Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, build-date=2026-01-12T22:10:15Z) Feb 20 03:40:21 localhost podman[97752]: 2026-02-20 08:40:21.653064672 +0000 UTC m=+0.245078334 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510) Feb 20 03:40:21 localhost podman[97756]: 2026-02-20 08:40:21.660171062 +0000 UTC m=+0.243160585 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, build-date=2026-01-12T23:07:30Z, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:40:21 localhost systemd[1]: 
55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:40:21 localhost podman[97753]: 2026-02-20 08:40:21.672888975 +0000 UTC m=+0.262296856 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, release=1766032510) Feb 20 03:40:21 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:40:21 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:40:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:40:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:40:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:40:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:40:28 localhost podman[97849]: 2026-02-20 08:40:28.15971977 +0000 UTC m=+0.085642598 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step5, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:40:28 localhost podman[97848]: 2026-02-20 08:40:28.211063286 +0000 UTC m=+0.137585843 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.openshift.expose-services=, release=1766032510, version=17.1.13, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 20 03:40:28 localhost podman[97849]: 2026-02-20 08:40:28.21734624 +0000 UTC m=+0.143269148 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 
'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:40:28 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:40:28 localhost podman[97846]: 2026-02-20 08:40:28.134355916 +0000 UTC m=+0.070770659 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4) Feb 20 03:40:28 localhost podman[97848]: 2026-02-20 08:40:28.258004077 +0000 UTC m=+0.184526644 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:40:28 localhost podman[97848]: unhealthy Feb 20 03:40:28 localhost podman[97846]: 2026-02-20 08:40:28.264134696 +0000 UTC m=+0.200549479 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 20 03:40:28 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:40:28 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:40:28 localhost podman[97846]: unhealthy Feb 20 03:40:28 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:40:28 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:40:28 localhost podman[97847]: 2026-02-20 08:40:28.355918792 +0000 UTC m=+0.287650740 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:40:28 localhost podman[97847]: 2026-02-20 08:40:28.370242645 +0000 UTC m=+0.301974623 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step3, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:40:28 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:40:32 localhost sshd[97932]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:40:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:40:32 localhost systemd[1]: tmp-crun.AxbY6a.mount: Deactivated successfully. Feb 20 03:40:32 localhost podman[97934]: 2026-02-20 08:40:32.657333714 +0000 UTC m=+0.094764760 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:40:33 localhost podman[97934]: 2026-02-20 08:40:33.02771548 +0000 UTC m=+0.465146586 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, release=1766032510) Feb 20 03:40:33 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:40:50 localhost sshd[97958]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:40:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:40:51 localhost podman[97960]: 2026-02-20 08:40:51.135027331 +0000 UTC m=+0.076700731 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 03:40:51 localhost podman[97960]: 2026-02-20 08:40:51.356985199 +0000 UTC m=+0.298658529 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13) Feb 20 03:40:51 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:40:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:40:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:40:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:40:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:40:52 localhost podman[97988]: 2026-02-20 08:40:52.16537745 +0000 UTC m=+0.098533915 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:30Z, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 20 03:40:52 localhost sshd[98040]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:40:52 localhost podman[97988]: 2026-02-20 08:40:52.198123193 +0000 UTC m=+0.131279618 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:40:52 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:40:52 localhost podman[97996]: 2026-02-20 08:40:52.226586602 +0000 UTC m=+0.148457889 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true) Feb 20 03:40:52 localhost podman[97990]: 2026-02-20 08:40:52.272205261 +0000 UTC m=+0.197543685 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 20 03:40:52 localhost podman[97990]: 2026-02-20 08:40:52.310056472 +0000 UTC m=+0.235394886 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, 
managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:40:52 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:40:52 localhost podman[97989]: 2026-02-20 08:40:52.322726513 +0000 UTC m=+0.252943568 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5) Feb 20 03:40:52 localhost podman[97989]: 2026-02-20 08:40:52.337951864 +0000 UTC m=+0.268168909 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Feb 20 03:40:52 localhost systemd[1]: 
1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:40:52 localhost podman[97996]: 2026-02-20 08:40:52.39478945 +0000 UTC m=+0.316660737 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:40:52 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:40:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:40:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:40:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:40:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:40:59 localhost podman[98088]: 2026-02-20 08:40:59.162519726 +0000 UTC m=+0.092417677 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z) Feb 20 03:40:59 localhost systemd[1]: tmp-crun.pnf8TQ.mount: Deactivated successfully. 
Feb 20 03:40:59 localhost podman[98085]: 2026-02-20 08:40:59.221851279 +0000 UTC m=+0.153400972 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, tcib_managed=true, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com) Feb 20 03:40:59 localhost podman[98088]: 2026-02-20 08:40:59.241662211 +0000 UTC m=+0.171560202 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Feb 20 03:40:59 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:40:59 localhost podman[98085]: 2026-02-20 08:40:59.262880297 +0000 UTC m=+0.194430020 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 20 03:40:59 localhost podman[98085]: unhealthy Feb 20 03:40:59 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:40:59 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:40:59 localhost podman[98084]: 2026-02-20 08:40:59.312129768 +0000 UTC m=+0.247030754 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:40:59 localhost podman[98084]: 2026-02-20 08:40:59.349014218 +0000 UTC m=+0.283915264 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1) Feb 20 03:40:59 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:40:59 localhost podman[98083]: 2026-02-20 08:40:59.361938718 +0000 UTC m=+0.300964492 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510) Feb 20 03:40:59 localhost podman[98083]: 2026-02-20 08:40:59.376997333 +0000 UTC m=+0.316023137 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:40:59 localhost podman[98083]: unhealthy Feb 20 03:40:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:40:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:41:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:41:04 localhost podman[98167]: 2026-02-20 08:41:04.139777182 +0000 UTC m=+0.078816897 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:41:04 localhost podman[98167]: 2026-02-20 08:41:04.532093865 +0000 UTC m=+0.471133560 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510) Feb 20 03:41:04 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:41:17 localhost sshd[98319]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:41:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:41:22 localhost systemd[1]: tmp-crun.POIA64.mount: Deactivated successfully. 
Feb 20 03:41:22 localhost podman[98321]: 2026-02-20 08:41:22.155360438 +0000 UTC m=+0.088181096 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, 
com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vcs-type=git, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:41:22 localhost podman[98321]: 2026-02-20 08:41:22.354185142 +0000 UTC m=+0.287005840 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:41:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:41:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:41:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:41:22 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:41:22 localhost systemd[1]: tmp-crun.6SUXfL.mount: Deactivated successfully. 
Feb 20 03:41:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:41:22 localhost podman[98350]: 2026-02-20 08:41:22.509998727 +0000 UTC m=+0.127551923 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T23:07:30Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13) Feb 20 03:41:22 localhost podman[98351]: 2026-02-20 08:41:22.556397971 +0000 UTC m=+0.166469295 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond) Feb 20 03:41:22 localhost podman[98350]: 2026-02-20 08:41:22.58906526 +0000 UTC m=+0.206618426 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z) Feb 20 03:41:22 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:41:22 localhost podman[98357]: 2026-02-20 08:41:22.601724272 +0000 UTC m=+0.209494205 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., tcib_managed=true) Feb 20 03:41:22 localhost podman[98357]: 2026-02-20 08:41:22.613800935 +0000 UTC m=+0.221570888 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 
'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.) Feb 20 03:41:22 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:41:22 localhost podman[98351]: 2026-02-20 08:41:22.641021346 +0000 UTC m=+0.251092690 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, container_name=logrotate_crond, 
com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:41:22 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:41:22 localhost podman[98388]: 2026-02-20 08:41:22.660442966 +0000 UTC m=+0.143677751 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute) Feb 20 03:41:22 localhost podman[98388]: 2026-02-20 08:41:22.692999032 +0000 UTC m=+0.176233847 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, container_name=ceilometer_agent_compute, version=17.1.13, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, 
com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc.) Feb 20 03:41:22 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:41:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:41:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:41:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:41:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:41:30 localhost systemd[1]: tmp-crun.8l4GVS.mount: Deactivated successfully. 
Feb 20 03:41:30 localhost podman[98445]: 2026-02-20 08:41:30.151073471 +0000 UTC m=+0.087744403 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public) Feb 20 03:41:30 localhost podman[98445]: 2026-02-20 08:41:30.195602257 +0000 UTC m=+0.132273139 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510) Feb 20 03:41:30 localhost podman[98445]: unhealthy Feb 20 03:41:30 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:41:30 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:41:30 localhost podman[98446]: 2026-02-20 08:41:30.208615959 +0000 UTC m=+0.139169032 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:41:30 localhost podman[98447]: 2026-02-20 08:41:30.17758087 +0000 UTC m=+0.105249874 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, 
Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z) Feb 20 03:41:30 localhost podman[98446]: 2026-02-20 08:41:30.245004924 +0000 UTC m=+0.175558037 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.13, container_name=iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:41:30 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:41:30 localhost podman[98447]: 2026-02-20 08:41:30.266994423 +0000 UTC m=+0.194663427 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, batch=17.1_20260112.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git) Feb 20 03:41:30 localhost podman[98447]: unhealthy Feb 20 03:41:30 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:41:30 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:41:30 localhost podman[98448]: 2026-02-20 08:41:30.313739608 +0000 UTC m=+0.238300525 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:41:30 localhost podman[98448]: 2026-02-20 08:41:30.366095755 +0000 UTC m=+0.290656642 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:41:30 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:41:30 localhost sshd[98527]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:41:30 localhost sshd[98528]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:41:31 localhost systemd[1]: tmp-crun.juDpTM.mount: Deactivated successfully. Feb 20 03:41:32 localhost sshd[98529]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:41:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:41:35 localhost podman[98531]: 2026-02-20 08:41:35.155879848 +0000 UTC m=+0.094400978 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, version=17.1.13, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:41:35 localhost podman[98531]: 2026-02-20 08:41:35.524999435 +0000 UTC m=+0.463520575 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible) Feb 20 03:41:35 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:41:53 localhost systemd[1]: tmp-crun.aOqJYB.mount: Deactivated successfully. Feb 20 03:41:53 localhost podman[98554]: 2026-02-20 08:41:53.175611975 +0000 UTC m=+0.100479636 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=logrotate_crond, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, 
architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, config_id=tripleo_step4) Feb 20 03:41:53 localhost podman[98554]: 2026-02-20 08:41:53.209895364 +0000 UTC m=+0.134763025 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 20 03:41:53 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:41:53 localhost podman[98553]: 2026-02-20 08:41:53.221776831 +0000 UTC m=+0.150032308 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
tcib_managed=true, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:41:53 localhost podman[98555]: 2026-02-20 08:41:53.314804946 +0000 UTC m=+0.239372868 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:41:53 localhost podman[98556]: 2026-02-20 08:41:53.156823934 +0000 UTC m=+0.082100679 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Feb 20 03:41:53 localhost podman[98553]: 2026-02-20 08:41:53.321310327 +0000 UTC m=+0.249565774 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64) Feb 20 03:41:53 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:41:53 localhost podman[98555]: 2026-02-20 08:41:53.375028027 +0000 UTC m=+0.299595999 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step3, container_name=collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:41:53 localhost podman[98557]: 2026-02-20 08:41:53.290200135 +0000 UTC m=+0.211148336 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 20 03:41:53 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:41:53 localhost podman[98556]: 2026-02-20 08:41:53.397884243 +0000 UTC m=+0.323160998 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z) Feb 20 03:41:53 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:41:53 localhost podman[98557]: 2026-02-20 08:41:53.508898043 +0000 UTC m=+0.429846224 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible) Feb 20 03:41:53 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:42:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:42:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:42:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:42:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:42:01 localhost systemd[1]: tmp-crun.ux8NCJ.mount: Deactivated successfully. Feb 20 03:42:01 localhost podman[98670]: 2026-02-20 08:42:01.174591126 +0000 UTC m=+0.104988576 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, 
url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Feb 20 03:42:01 localhost podman[98669]: 2026-02-20 08:42:01.14072456 +0000 UTC m=+0.077905199 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.13, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public) Feb 20 03:42:01 localhost podman[98668]: 2026-02-20 08:42:01.197007499 +0000 UTC m=+0.134007092 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com) Feb 20 03:42:01 localhost podman[98668]: 2026-02-20 08:42:01.213894151 +0000 UTC m=+0.150893744 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 20 03:42:01 localhost podman[98668]: unhealthy Feb 20 03:42:01 localhost podman[98669]: 2026-02-20 08:42:01.220253397 +0000 UTC m=+0.157434016 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com) Feb 20 03:42:01 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:42:01 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:42:01 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:42:01 localhost podman[98670]: 2026-02-20 08:42:01.311316191 +0000 UTC m=+0.241713591 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
build-date=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:42:01 localhost podman[98670]: unhealthy Feb 20 03:42:01 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:42:01 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:42:01 localhost podman[98671]: 2026-02-20 08:42:01.36176552 +0000 UTC m=+0.290309672 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step5) Feb 20 03:42:01 localhost podman[98671]: 2026-02-20 08:42:01.389685823 +0000 UTC m=+0.318229965 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.buildah.version=1.41.5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute) Feb 20 03:42:01 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:42:01 localhost sshd[98754]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:42:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:42:06 localhost podman[98756]: 2026-02-20 08:42:06.148698785 +0000 UTC m=+0.088130604 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:42:06 localhost podman[98756]: 2026-02-20 08:42:06.523154126 +0000 UTC m=+0.462585995 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true)
Feb 20 03:42:06 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 03:42:15 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 03:42:15 localhost recover_tripleo_nova_virtqemud[98795]: 63005
Feb 20 03:42:15 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 03:42:15 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 03:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 03:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 03:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 03:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 03:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 03:42:24 localhost podman[98861]: 2026-02-20 08:42:24.146088049 +0000 UTC m=+0.071411659 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vendor=Red Hat, Inc.) Feb 20 03:42:24 localhost podman[98857]: 2026-02-20 08:42:24.207741284 +0000 UTC m=+0.138276144 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 20 03:42:24 localhost podman[98858]: 2026-02-20 08:42:24.246530423 +0000 UTC m=+0.174024099 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.component=openstack-cron-container) Feb 20 03:42:24 localhost podman[98858]: 2026-02-20 08:42:24.257057587 +0000 UTC m=+0.184551283 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack 
Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:42:24 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:42:24 localhost podman[98860]: 2026-02-20 08:42:24.312496541 +0000 UTC m=+0.237525422 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) 
Feb 20 03:42:24 localhost podman[98857]: 2026-02-20 08:42:24.313024067 +0000 UTC m=+0.243558907 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64) Feb 20 03:42:24 localhost podman[98861]: 2026-02-20 08:42:24.350242128 +0000 UTC m=+0.275565758 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.13, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1) Feb 20 03:42:24 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:42:24 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:42:24 localhost podman[98859]: 2026-02-20 08:42:24.360083961 +0000 UTC m=+0.287787283 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3) Feb 20 03:42:24 localhost podman[98859]: 2026-02-20 08:42:24.45225582 +0000 UTC m=+0.379959252 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git) Feb 20 03:42:24 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:42:24 localhost podman[98860]: 2026-02-20 08:42:24.470917387 +0000 UTC m=+0.395946298 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 20 03:42:24 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:42:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:42:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:42:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:42:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:42:32 localhost podman[98976]: 2026-02-20 08:42:32.152120382 +0000 UTC m=+0.084517792 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:42:32 localhost podman[98976]: 2026-02-20 08:42:32.16853853 +0000 UTC m=+0.100935900 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container) Feb 20 03:42:32 localhost podman[98976]: unhealthy Feb 20 03:42:32 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:42:32 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:42:32 localhost podman[98977]: 2026-02-20 08:42:32.251800462 +0000 UTC m=+0.182580293 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13) Feb 20 03:42:32 localhost podman[98977]: 2026-02-20 08:42:32.262183533 +0000 UTC m=+0.192963404 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, release=1766032510) Feb 20 03:42:32 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:42:32 localhost podman[98978]: 2026-02-20 08:42:32.315751228 +0000 UTC m=+0.244708092 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:42:32 localhost podman[98978]: 2026-02-20 08:42:32.362148693 +0000 UTC m=+0.291105477 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team) Feb 20 03:42:32 localhost podman[98978]: unhealthy Feb 20 03:42:32 localhost podman[98979]: 2026-02-20 08:42:32.373358979 +0000 UTC m=+0.300187587 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, vcs-type=git, container_name=nova_compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 20 03:42:32 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:42:32 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:42:32 localhost podman[98979]: 2026-02-20 08:42:32.402157239 +0000 UTC m=+0.328985867 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step5) Feb 20 03:42:32 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:42:35 localhost sshd[99053]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:42:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:42:36 localhost podman[99055]: 2026-02-20 08:42:36.819197265 +0000 UTC m=+0.087166215 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:42:37 localhost podman[99055]: 2026-02-20 08:42:37.193771279 +0000 UTC m=+0.461740249 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, 
architecture=x86_64, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:42:37 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:42:44 localhost sshd[99078]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:42:55 localhost systemd[1]: tmp-crun.QA6ySW.mount: Deactivated successfully. Feb 20 03:42:55 localhost podman[99080]: 2026-02-20 08:42:55.13345043 +0000 UTC m=+0.071440588 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64) Feb 20 03:42:55 localhost podman[99081]: 2026-02-20 08:42:55.187374347 +0000 UTC m=+0.120674921 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 20 03:42:55 localhost podman[99080]: 2026-02-20 08:42:55.212690639 +0000 UTC m=+0.150680817 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1766032510) Feb 20 03:42:55 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:42:55 localhost podman[99094]: 2026-02-20 08:42:55.260857978 +0000 UTC m=+0.181095928 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:42:55 localhost podman[99081]: 2026-02-20 08:42:55.266586255 +0000 UTC m=+0.199886879 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, 
vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 20 03:42:55 localhost systemd[1]: 
1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:42:55 localhost podman[99086]: 2026-02-20 08:42:55.162690554 +0000 UTC m=+0.089666522 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=) Feb 20 03:42:55 localhost podman[99088]: 2026-02-20 08:42:55.318212129 +0000 UTC m=+0.242524294 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=) Feb 20 03:42:55 localhost podman[99088]: 2026-02-20 08:42:55.346959527 +0000 UTC m=+0.271271703 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:42:55 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:42:55 localhost podman[99086]: 2026-02-20 08:42:55.397424478 +0000 UTC m=+0.324400526 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step3, url=https://www.redhat.com) Feb 20 03:42:55 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:42:55 localhost podman[99094]: 2026-02-20 08:42:55.463151669 +0000 UTC m=+0.383389669 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:42:55 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:43:01 localhost sshd[99198]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:43:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:43:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:43:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:43:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:43:02 localhost systemd[1]: tmp-crun.um9af3.mount: Deactivated successfully. Feb 20 03:43:02 localhost systemd[1]: tmp-crun.lO3DaM.mount: Deactivated successfully. 
Feb 20 03:43:02 localhost podman[99201]: 2026-02-20 08:43:02.656221608 +0000 UTC m=+0.068789628 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510) Feb 20 03:43:02 localhost podman[99202]: 2026-02-20 08:43:02.726413327 +0000 UTC m=+0.134505938 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:43:02 localhost podman[99201]: 2026-02-20 08:43:02.745192606 +0000 UTC m=+0.157760666 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, container_name=iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64) Feb 20 03:43:02 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:43:02 localhost podman[99202]: 2026-02-20 08:43:02.79642334 +0000 UTC m=+0.204515941 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 20 03:43:02 localhost podman[99202]: unhealthy Feb 20 03:43:02 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:43:02 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:43:02 localhost podman[99200]: 2026-02-20 08:43:02.697930707 +0000 UTC m=+0.111435115 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=ovn_controller, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.13) Feb 20 03:43:02 localhost podman[99203]: 2026-02-20 08:43:02.885890765 +0000 UTC m=+0.289465057 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:43:02 localhost podman[99200]: 2026-02-20 08:43:02.932292348 +0000 UTC m=+0.345796806 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true) Feb 20 03:43:02 localhost podman[99200]: unhealthy Feb 20 03:43:02 localhost systemd[1]: 
0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:43:02 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:43:02 localhost sshd[99281]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:43:02 localhost podman[99203]: 2026-02-20 08:43:02.989595959 +0000 UTC m=+0.393170231 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5) Feb 20 03:43:03 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:43:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:43:08 localhost systemd[1]: tmp-crun.FAMDnz.mount: Deactivated successfully. Feb 20 03:43:08 localhost podman[99285]: 2026-02-20 08:43:08.159975123 +0000 UTC m=+0.095429960 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:43:08 localhost podman[99285]: 2026-02-20 08:43:08.530518564 +0000 UTC m=+0.465973361 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, container_name=nova_migration_target, 
tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=) Feb 20 03:43:08 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:43:17 localhost podman[99414]: 2026-02-20 08:43:17.924155684 +0000 UTC m=+0.085338488 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=) Feb 20 03:43:18 localhost podman[99414]: 2026-02-20 08:43:18.031888893 +0000 UTC m=+0.193071757 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on 
RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, architecture=x86_64, release=1770267347, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph) Feb 20 03:43:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:43:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:43:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:43:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:43:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:43:26 localhost systemd[1]: tmp-crun.29YShZ.mount: Deactivated successfully. Feb 20 03:43:26 localhost systemd[1]: tmp-crun.1CSJfB.mount: Deactivated successfully. 
Feb 20 03:43:26 localhost podman[99557]: 2026-02-20 08:43:26.142143473 +0000 UTC m=+0.078327061 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, 
io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:43:26 localhost podman[99560]: 2026-02-20 08:43:26.175645729 +0000 UTC m=+0.103658224 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, distribution-scope=public, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Feb 20 03:43:26 localhost podman[99558]: 2026-02-20 08:43:26.235775657 +0000 UTC m=+0.172739919 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5) Feb 20 03:43:26 localhost podman[99558]: 2026-02-20 08:43:26.246404276 +0000 UTC m=+0.183368568 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, config_id=tripleo_step4) Feb 20 03:43:26 localhost podman[99560]: 2026-02-20 08:43:26.260079738 +0000 UTC m=+0.188092263 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:43:26 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:43:26 localhost podman[99557]: 2026-02-20 08:43:26.278286671 +0000 UTC m=+0.214470289 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, version=17.1.13, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1) Feb 20 03:43:26 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:43:26 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:43:26 localhost podman[99559]: 2026-02-20 08:43:26.195964517 +0000 UTC m=+0.131498314 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z) Feb 20 03:43:26 localhost podman[99559]: 2026-02-20 08:43:26.325773628 +0000 UTC m=+0.261307435 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc.) Feb 20 03:43:26 localhost podman[99561]: 2026-02-20 08:43:26.338534842 +0000 UTC m=+0.271137309 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Feb 20 03:43:26 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:43:26 localhost podman[99561]: 2026-02-20 08:43:26.532058092 +0000 UTC m=+0.464660589 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, tcib_managed=true, container_name=metrics_qdr, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5) Feb 20 03:43:26 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:43:30 localhost sshd[99677]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:43:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:43:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:43:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:43:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:43:33 localhost podman[99679]: 2026-02-20 08:43:33.15684785 +0000 UTC m=+0.095008978 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Feb 20 03:43:33 localhost podman[99679]: 2026-02-20 08:43:33.197003961 +0000 UTC m=+0.135165099 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.13, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:43:33 localhost podman[99679]: unhealthy Feb 20 03:43:33 localhost systemd[1]: tmp-crun.bNiCFW.mount: Deactivated successfully. Feb 20 03:43:33 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:43:33 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 03:43:33 localhost podman[99680]: 2026-02-20 08:43:33.225216823 +0000 UTC m=+0.158666905 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:43:33 localhost podman[99680]: 2026-02-20 08:43:33.260115681 +0000 UTC m=+0.193565703 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, release=1766032510, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z) Feb 20 03:43:33 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:43:33 localhost podman[99681]: 2026-02-20 08:43:33.272604157 +0000 UTC m=+0.204550572 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, version=17.1.13, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:43:33 localhost podman[99681]: 2026-02-20 08:43:33.310376674 +0000 UTC m=+0.242323129 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, 
io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:43:33 localhost podman[99681]: unhealthy Feb 20 03:43:33 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:43:33 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:43:33 localhost podman[99682]: 2026-02-20 08:43:33.360010478 +0000 UTC m=+0.288085943 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:43:33 localhost podman[99682]: 2026-02-20 08:43:33.385050522 +0000 UTC m=+0.313126007 container exec_died 
a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:43:33 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:43:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:43:39 localhost systemd[1]: tmp-crun.NPBO40.mount: Deactivated successfully. 
Feb 20 03:43:39 localhost podman[99770]: 2026-02-20 08:43:39.159177992 +0000 UTC m=+0.096007317 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, 
io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:43:39 localhost podman[99770]: 2026-02-20 08:43:39.550459504 +0000 UTC m=+0.487288789 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc.) Feb 20 03:43:39 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:43:49 localhost sshd[99793]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:43:49 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:43:50 localhost recover_tripleo_nova_virtqemud[99796]: 63005 Feb 20 03:43:50 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:43:50 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:43:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:43:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:43:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:43:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:43:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:43:57 localhost podman[99801]: 2026-02-20 08:43:57.137916609 +0000 UTC m=+0.072314275 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=metrics_qdr, config_id=tripleo_step1, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:43:57 localhost podman[99797]: 2026-02-20 08:43:57.174524331 +0000 UTC m=+0.114068237 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z) Feb 20 03:43:57 localhost podman[99800]: 2026-02-20 08:43:57.186549202 +0000 UTC m=+0.121087303 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=) Feb 20 03:43:57 localhost podman[99800]: 2026-02-20 08:43:57.204821707 +0000 UTC m=+0.139359778 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2026-01-12T23:07:47Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Feb 20 03:43:57 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:43:57 localhost podman[99798]: 2026-02-20 08:43:57.122445681 +0000 UTC m=+0.063486092 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z) Feb 20 03:43:57 localhost podman[99798]: 2026-02-20 08:43:57.255884935 +0000 UTC m=+0.196925366 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container) Feb 20 03:43:57 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:43:57 localhost podman[99797]: 2026-02-20 08:43:57.274595512 +0000 UTC m=+0.214139438 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:43:57 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:43:57 localhost podman[99799]: 2026-02-20 08:43:57.34991082 +0000 UTC m=+0.290038363 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 20 03:43:57 localhost podman[99801]: 2026-02-20 08:43:57.386973126 +0000 UTC m=+0.321370752 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, 
build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, 
com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:43:57 localhost podman[99799]: 2026-02-20 08:43:57.38743718 +0000 UTC m=+0.327564723 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible) Feb 20 03:43:57 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:43:57 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:43:58 localhost systemd[1]: tmp-crun.8l1bOS.mount: Deactivated successfully. Feb 20 03:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:44:04 localhost systemd[1]: tmp-crun.IzS6rz.mount: Deactivated successfully. Feb 20 03:44:04 localhost podman[99920]: 2026-02-20 08:44:04.224202608 +0000 UTC m=+0.143672670 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5) Feb 20 03:44:04 localhost podman[99915]: 2026-02-20 08:44:04.198737191 +0000 UTC m=+0.125461688 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:44:04 localhost podman[99914]: 2026-02-20 08:44:04.264448953 +0000 UTC m=+0.191690806 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, version=17.1.13, release=1766032510, architecture=x86_64, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public) Feb 20 03:44:04 localhost podman[99915]: 2026-02-20 08:44:04.282004845 +0000 UTC m=+0.208729322 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 20 03:44:04 localhost podman[99913]: 2026-02-20 08:44:04.17702094 +0000 UTC m=+0.106190601 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64) Feb 20 03:44:04 localhost podman[99915]: unhealthy Feb 20 03:44:04 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:44:04 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:44:04 localhost podman[99914]: 2026-02-20 08:44:04.303887231 +0000 UTC m=+0.231129084 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:44:04 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:44:04 localhost podman[99920]: 2026-02-20 08:44:04.354290219 +0000 UTC m=+0.273760301 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:44:04 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:44:04 localhost podman[99913]: 2026-02-20 08:44:04.365705062 +0000 UTC m=+0.294874703 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 20 03:44:04 localhost podman[99913]: unhealthy Feb 20 03:44:04 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:44:04 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:44:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:44:10 localhost systemd[1]: tmp-crun.iSw0Cf.mount: Deactivated successfully. 
Feb 20 03:44:10 localhost podman[99996]: 2026-02-20 08:44:10.142900608 +0000 UTC m=+0.082661215 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:44:10 localhost podman[99996]: 2026-02-20 08:44:10.536977046 +0000 UTC m=+0.476737643 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:44:10 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:44:14 localhost sshd[100020]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:44:26 localhost sshd[100098]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:44:27 localhost systemd[1]: tmp-crun.Zbn6xR.mount: Deactivated successfully. Feb 20 03:44:27 localhost podman[100103]: 2026-02-20 08:44:27.569803913 +0000 UTC m=+0.139653357 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:44:27 localhost podman[100103]: 2026-02-20 08:44:27.578807221 +0000 UTC m=+0.148656675 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red 
Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, container_name=collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-collectd-container) Feb 20 03:44:27 localhost podman[100101]: 2026-02-20 08:44:27.536739251 +0000 UTC m=+0.109040701 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, 
config_id=tripleo_step4, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:44:27 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:44:27 localhost podman[100101]: 2026-02-20 08:44:27.621073417 +0000 UTC m=+0.193374837 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron) Feb 20 03:44:27 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:44:27 localhost podman[100102]: 2026-02-20 08:44:27.531127997 +0000 UTC m=+0.103598552 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:44:27 localhost podman[100102]: 2026-02-20 08:44:27.663404445 +0000 UTC m=+0.235874990 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, 
vcs-type=git, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:44:27 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:44:27 localhost podman[100151]: 2026-02-20 08:44:27.711391408 +0000 UTC m=+0.178984622 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr) Feb 20 03:44:27 localhost podman[100100]: 2026-02-20 08:44:27.763028114 +0000 UTC m=+0.335084286 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, 
container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, tcib_managed=true) Feb 20 03:44:27 localhost podman[100100]: 2026-02-20 08:44:27.791993149 +0000 UTC m=+0.364049281 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:44:27 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:44:27 localhost podman[100151]: 2026-02-20 08:44:27.936009299 +0000 UTC m=+0.403602523 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, tcib_managed=true) Feb 20 03:44:27 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:44:35 localhost podman[100220]: 2026-02-20 08:44:35.149779398 +0000 UTC m=+0.080268132 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ovn_controller) Feb 20 03:44:35 localhost podman[100220]: 2026-02-20 08:44:35.160925192 +0000 UTC m=+0.091413986 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, 
distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Feb 20 03:44:35 localhost podman[100220]: unhealthy Feb 20 03:44:35 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:44:35 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:44:35 localhost podman[100223]: 2026-02-20 08:44:35.248491528 +0000 UTC m=+0.175015590 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, distribution-scope=public, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:44:35 localhost podman[100222]: 2026-02-20 08:44:35.295587594 +0000 UTC m=+0.221078883 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13) Feb 20 03:44:35 localhost podman[100222]: 2026-02-20 08:44:35.312010371 +0000 UTC m=+0.237501670 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc.) Feb 20 03:44:35 localhost podman[100222]: unhealthy Feb 20 03:44:35 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:44:35 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:44:35 localhost podman[100223]: 2026-02-20 08:44:35.329662587 +0000 UTC m=+0.256186649 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_id=tripleo_step5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:44:35 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:44:35 localhost podman[100221]: 2026-02-20 08:44:35.400607799 +0000 UTC m=+0.329100691 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, container_name=iscsid) Feb 20 03:44:35 localhost podman[100221]: 2026-02-20 08:44:35.438105218 +0000 UTC m=+0.366598100 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=) Feb 20 03:44:35 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:44:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:44:41 localhost systemd[1]: tmp-crun.V0WMSn.mount: Deactivated successfully. Feb 20 03:44:41 localhost podman[100304]: 2026-02-20 08:44:41.141908036 +0000 UTC m=+0.081730527 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:44:41 localhost podman[100304]: 2026-02-20 08:44:41.472106159 +0000 UTC m=+0.411928640 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=nova_migration_target) Feb 20 03:44:41 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:44:58 localhost systemd[1]: tmp-crun.MSrB0o.mount: Deactivated successfully. Feb 20 03:44:58 localhost podman[100328]: 2026-02-20 08:44:58.203694275 +0000 UTC m=+0.137106427 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, config_id=tripleo_step4) Feb 20 03:44:58 localhost podman[100328]: 2026-02-20 08:44:58.210856617 +0000 UTC m=+0.144268759 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-cron-container, distribution-scope=public, version=17.1.13, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, release=1766032510) Feb 20 03:44:58 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:44:58 localhost podman[100327]: 2026-02-20 08:44:58.293747388 +0000 UTC m=+0.230885505 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public) Feb 20 03:44:58 localhost podman[100335]: 2026-02-20 08:44:58.250628316 +0000 UTC m=+0.175842885 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, vendor=Red Hat, Inc.) 
Feb 20 03:44:58 localhost podman[100327]: 2026-02-20 08:44:58.322211207 +0000 UTC m=+0.259349374 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:44:58 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:44:58 localhost podman[100335]: 2026-02-20 08:44:58.336035655 +0000 UTC m=+0.261250264 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13) Feb 20 03:44:58 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:44:58 localhost podman[100329]: 2026-02-20 08:44:58.177192797 +0000 UTC m=+0.106366458 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, release=1766032510, distribution-scope=public) Feb 20 03:44:58 localhost podman[100341]: 2026-02-20 08:44:58.278384904 +0000 UTC m=+0.199444824 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc.) 
Feb 20 03:44:58 localhost podman[100341]: 2026-02-20 08:44:58.445039363 +0000 UTC m=+0.366099213 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:44:58 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:44:58 localhost podman[100329]: 2026-02-20 08:44:58.461991017 +0000 UTC m=+0.391164718 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.5) Feb 20 03:44:58 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:44:59 localhost sshd[100450]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 03:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:45:06 localhost systemd[1]: tmp-crun.o80s7g.mount: Deactivated successfully. Feb 20 03:45:06 localhost podman[100452]: 2026-02-20 08:45:06.160808725 +0000 UTC m=+0.098858566 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true) Feb 20 03:45:06 localhost podman[100452]: 2026-02-20 08:45:06.178219753 +0000 UTC m=+0.116269604 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ovn_controller, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc.) 
Feb 20 03:45:06 localhost podman[100454]: 2026-02-20 08:45:06.178220063 +0000 UTC m=+0.108107782 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:45:06 localhost podman[100452]: unhealthy Feb 20 03:45:06 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:45:06 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 03:45:06 localhost podman[100453]: 2026-02-20 08:45:06.24187383 +0000 UTC m=+0.174441312 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
tcib_managed=true, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:45:06 localhost podman[100453]: 2026-02-20 08:45:06.248562687 +0000 UTC m=+0.181130159 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:45:06 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:45:06 localhost podman[100454]: 2026-02-20 08:45:06.26128737 +0000 UTC m=+0.191175139 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:45:06 localhost podman[100454]: unhealthy Feb 20 03:45:06 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:45:06 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:45:06 localhost podman[100460]: 2026-02-20 08:45:06.314713641 +0000 UTC m=+0.238448810 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public) Feb 20 03:45:06 localhost podman[100460]: 2026-02-20 08:45:06.366709128 +0000 UTC m=+0.290444307 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-type=git, container_name=nova_compute, url=https://www.redhat.com) Feb 20 03:45:06 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:45:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:45:12 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:45:12 localhost recover_tripleo_nova_virtqemud[100539]: 63005 Feb 20 03:45:12 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:45:12 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:45:12 localhost podman[100537]: 2026-02-20 08:45:12.152780288 +0000 UTC m=+0.087201525 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 20 03:45:12 localhost podman[100537]: 2026-02-20 08:45:12.551069827 +0000 UTC m=+0.485491014 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64) Feb 20 03:45:12 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:45:23 localhost sshd[100639]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:45:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:45:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:45:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:45:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:45:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:45:29 localhost systemd[1]: tmp-crun.WUwYP1.mount: Deactivated successfully. Feb 20 03:45:29 localhost podman[100644]: 2026-02-20 08:45:29.20447822 +0000 UTC m=+0.123175007 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:45:29 localhost podman[100641]: 2026-02-20 08:45:29.16825135 +0000 UTC m=+0.092733536 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, 
tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 
03:45:29 localhost podman[100643]: 2026-02-20 08:45:29.226010706 +0000 UTC m=+0.147852720 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:45:29 localhost podman[100645]: 2026-02-20 08:45:29.309520276 +0000 UTC m=+0.226372916 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, 
name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:45:29 localhost podman[100641]: 2026-02-20 08:45:29.322444085 +0000 UTC m=+0.246926301 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., 
distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com) Feb 20 03:45:29 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:45:29 localhost podman[100642]: 2026-02-20 08:45:29.283863414 +0000 UTC m=+0.207807444 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:45:29 localhost podman[100642]: 2026-02-20 08:45:29.363902917 +0000 UTC m=+0.287846937 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:45:29 localhost podman[100643]: 2026-02-20 08:45:29.373930887 +0000 UTC m=+0.295772971 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, 
summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, release=1766032510, build-date=2026-01-12T22:10:15Z, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:45:29 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:45:29 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:45:29 localhost podman[100644]: 2026-02-20 08:45:29.428887144 +0000 UTC m=+0.347583911 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:45:29 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:45:29 localhost podman[100645]: 2026-02-20 08:45:29.539101101 +0000 UTC m=+0.455953731 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 
qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Feb 20 03:45:29 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:45:37 localhost systemd[1]: tmp-crun.Pj2UiB.mount: Deactivated successfully. 
Feb 20 03:45:37 localhost podman[100763]: 2026-02-20 08:45:37.156453571 +0000 UTC m=+0.088600809 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:45:37 localhost podman[100763]: 2026-02-20 08:45:37.167981147 +0000 UTC m=+0.100128345 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, batch=17.1_20260112.1, container_name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public) Feb 20 03:45:37 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:45:37 localhost podman[100765]: 2026-02-20 08:45:37.208070836 +0000 UTC m=+0.130716060 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1766032510, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:45:37 localhost podman[100765]: 2026-02-20 08:45:37.241033465 +0000 UTC m=+0.163678749 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:45:37 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:45:37 localhost podman[100762]: 2026-02-20 08:45:37.248197066 +0000 UTC m=+0.181584982 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com) Feb 20 03:45:37 localhost podman[100764]: 2026-02-20 08:45:37.31628396 +0000 UTC m=+0.240838423 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, version=17.1.13, 
vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Feb 20 03:45:37 localhost podman[100762]: 2026-02-20 08:45:37.327495687 +0000 UTC m=+0.260883633 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, 
tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510) Feb 20 03:45:37 localhost podman[100762]: unhealthy Feb 20 03:45:37 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:45:37 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:45:37 localhost podman[100764]: 2026-02-20 08:45:37.363025235 +0000 UTC m=+0.287579678 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 20 03:45:37 localhost podman[100764]: unhealthy Feb 20 03:45:37 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:45:37 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:45:43 localhost sshd[100844]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:45:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:45:44 localhost podman[100846]: 2026-02-20 08:45:44.034585989 +0000 UTC m=+0.086734211 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:45:44 localhost sshd[100868]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:45:44 localhost podman[100846]: 2026-02-20 08:45:44.431302219 +0000 UTC m=+0.483450421 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
container_name=nova_migration_target, distribution-scope=public, config_id=tripleo_step4) Feb 20 03:45:44 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:45:48 localhost sshd[100870]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:46:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:46:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:46:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:46:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:46:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:46:00 localhost systemd[1]: tmp-crun.cT3iAC.mount: Deactivated successfully. 
Feb 20 03:46:00 localhost podman[100876]: 2026-02-20 08:46:00.153189314 +0000 UTC m=+0.080400385 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:46:00 localhost podman[100872]: 2026-02-20 08:46:00.215047216 +0000 UTC m=+0.147930743 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:46:00 localhost podman[100875]: 2026-02-20 08:46:00.180896891 +0000 UTC m=+0.102675944 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, container_name=ceilometer_agent_compute) Feb 20 03:46:00 localhost podman[100875]: 2026-02-20 08:46:00.264030159 +0000 UTC m=+0.185809272 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, url=https://www.redhat.com) Feb 20 03:46:00 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:46:00 localhost podman[100873]: 2026-02-20 08:46:00.24466191 +0000 UTC m=+0.174714019 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:46:00 localhost podman[100873]: 2026-02-20 08:46:00.327968135 +0000 UTC m=+0.258020224 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git) Feb 20 03:46:00 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:46:00 localhost podman[100876]: 2026-02-20 08:46:00.352773192 +0000 UTC m=+0.279984243 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-type=git, version=17.1.13) Feb 20 03:46:00 localhost podman[100872]: 2026-02-20 08:46:00.351145381 +0000 UTC m=+0.284028878 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:46:00 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:46:00 localhost podman[100874]: 2026-02-20 08:46:00.268775536 +0000 UTC m=+0.196569665 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 20 03:46:00 localhost podman[100874]: 2026-02-20 08:46:00.399034882 +0000 UTC m=+0.326828961 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, name=rhosp-rhel9/openstack-collectd, container_name=collectd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, architecture=x86_64) Feb 20 03:46:00 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:46:00 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:46:08 localhost podman[100991]: 2026-02-20 08:46:08.153229981 +0000 UTC m=+0.090323763 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, version=17.1.13, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:46:08 localhost podman[100991]: 2026-02-20 08:46:08.171059341 +0000 UTC m=+0.108153113 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
container_name=ovn_controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 20 03:46:08 localhost podman[100991]: unhealthy Feb 20 03:46:08 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:46:08 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:46:08 localhost systemd[1]: tmp-crun.kzTDFg.mount: Deactivated successfully. 
Feb 20 03:46:08 localhost podman[100993]: 2026-02-20 08:46:08.270269757 +0000 UTC m=+0.198559877 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:46:08 localhost podman[100992]: 2026-02-20 08:46:08.300747038 +0000 UTC m=+0.233848686 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step3) Feb 20 03:46:08 localhost podman[100992]: 2026-02-20 08:46:08.310090388 +0000 UTC m=+0.243192046 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=iscsid, config_id=tripleo_step3, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64) Feb 20 03:46:08 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:46:08 localhost podman[100999]: 2026-02-20 08:46:08.225915297 +0000 UTC m=+0.148865212 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com) Feb 20 03:46:08 localhost podman[100999]: 2026-02-20 08:46:08.356669437 +0000 
UTC m=+0.279619372 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, version=17.1.13, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:46:08 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:46:08 localhost podman[100993]: 2026-02-20 08:46:08.411407418 +0000 UTC m=+0.339697488 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:46:08 localhost podman[100993]: unhealthy Feb 20 03:46:08 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:46:08 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:46:10 localhost sshd[101076]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:46:15 localhost podman[101078]: 2026-02-20 08:46:15.135543948 +0000 UTC m=+0.069265342 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) Feb 20 03:46:15 localhost podman[101078]: 2026-02-20 08:46:15.516522911 +0000 UTC m=+0.450244385 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:46:15 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:46:23 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:46:23 localhost recover_tripleo_nova_virtqemud[101117]: 63005 Feb 20 03:46:23 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:46:23 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:46:28 localhost sshd[101179]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:46:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:46:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:46:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:46:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:46:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:46:31 localhost podman[101183]: 2026-02-20 08:46:31.157451566 +0000 UTC m=+0.084610437 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, container_name=collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64) Feb 20 03:46:31 localhost podman[101183]: 2026-02-20 08:46:31.168007122 +0000 UTC m=+0.095166023 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, config_id=tripleo_step3) Feb 20 03:46:31 localhost podman[101195]: 2026-02-20 08:46:31.180081475 +0000 UTC m=+0.100043443 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, 
distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Feb 20 03:46:31 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:46:31 localhost podman[101181]: 2026-02-20 08:46:31.260978677 +0000 UTC m=+0.193131173 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible) Feb 20 03:46:31 localhost podman[101181]: 2026-02-20 08:46:31.310024444 +0000 UTC m=+0.242176960 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13) Feb 20 03:46:31 localhost podman[101182]: 2026-02-20 08:46:31.315371079 +0000 UTC m=+0.247597397 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public) Feb 20 03:46:31 localhost systemd[1]: 
1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:46:31 localhost podman[101182]: 2026-02-20 08:46:31.322184389 +0000 UTC m=+0.254410697 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, release=1766032510) Feb 20 03:46:31 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:46:31 localhost podman[101184]: 2026-02-20 08:46:31.364298681 +0000 UTC m=+0.290071879 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:46:31 localhost podman[101195]: 2026-02-20 08:46:31.407094784 +0000 UTC m=+0.327056802 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, release=1766032510, com.redhat.component=openstack-qdrouterd-container, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:46:31 localhost podman[101184]: 2026-02-20 08:46:31.417344241 +0000 UTC m=+0.343117469 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, vcs-type=git, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Feb 20 03:46:31 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:46:31 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:46:32 localhost systemd[1]: tmp-crun.xGKa2T.mount: Deactivated successfully. Feb 20 03:46:37 localhost sshd[101298]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:46:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:46:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:46:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. 
Feb 20 03:46:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:46:38 localhost systemd[1]: tmp-crun.KVAkKb.mount: Deactivated successfully. Feb 20 03:46:38 localhost podman[101301]: 2026-02-20 08:46:38.69433985 +0000 UTC m=+0.085998770 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true) Feb 20 03:46:38 localhost podman[101301]: 2026-02-20 08:46:38.710747538 +0000 UTC m=+0.102406438 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:46:38 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:46:38 localhost podman[101303]: 2026-02-20 08:46:38.790841114 +0000 UTC m=+0.174236498 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:46:38 localhost podman[101302]: 2026-02-20 08:46:38.713126491 +0000 UTC m=+0.096744672 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:46:38 localhost podman[101300]: 2026-02-20 08:46:38.844558694 +0000 UTC m=+0.236344168 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:46:38 localhost podman[101303]: 2026-02-20 08:46:38.851982854 +0000 UTC m=+0.235378148 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 
nova-compute, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z) Feb 20 03:46:38 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:46:38 localhost podman[101300]: 2026-02-20 08:46:38.884978424 +0000 UTC m=+0.276763808 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 20 03:46:38 localhost podman[101300]: unhealthy Feb 20 03:46:38 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:46:38 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:46:38 localhost podman[101302]: 2026-02-20 08:46:38.898038107 +0000 UTC m=+0.281656288 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:46:38 localhost podman[101302]: unhealthy Feb 20 03:46:38 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:46:38 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:46:46 localhost podman[101382]: 2026-02-20 08:46:46.145268314 +0000 UTC m=+0.082118030 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, distribution-scope=public, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:46:46 localhost podman[101382]: 2026-02-20 08:46:46.510143716 +0000 UTC m=+0.446993432 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, release=1766032510, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:46:46 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:46:50 localhost sshd[101405]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:47:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:47:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:47:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:47:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:47:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:47:02 localhost podman[101411]: 2026-02-20 08:47:02.139482298 +0000 UTC m=+0.064768583 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.) Feb 20 03:47:02 localhost systemd[1]: tmp-crun.tSGOav.mount: Deactivated successfully. 
Feb 20 03:47:02 localhost podman[101420]: 2026-02-20 08:47:02.181856998 +0000 UTC m=+0.098140965 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true) Feb 20 03:47:02 localhost podman[101410]: 2026-02-20 08:47:02.199958347 +0000 UTC m=+0.128616768 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:47:02 localhost podman[101408]: 2026-02-20 08:47:02.201703862 +0000 UTC m=+0.130920890 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, 
config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, 
vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z) Feb 20 03:47:02 localhost podman[101409]: 2026-02-20 08:47:02.259112386 +0000 UTC m=+0.186816436 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-cron, 
architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, container_name=logrotate_crond, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 20 03:47:02 localhost podman[101411]: 2026-02-20 08:47:02.279298031 +0000 UTC m=+0.204584386 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:47:02 localhost podman[101408]: 2026-02-20 08:47:02.281362214 +0000 UTC m=+0.210579232 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, 
vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 20 03:47:02 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:47:02 localhost podman[101409]: 2026-02-20 08:47:02.292949773 +0000 UTC m=+0.220653833 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, 
name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:47:02 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:47:02 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:47:02 localhost podman[101420]: 2026-02-20 08:47:02.413770428 +0000 UTC m=+0.330054415 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, config_id=tripleo_step1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:47:02 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:47:02 localhost podman[101410]: 2026-02-20 08:47:02.436814511 +0000 UTC m=+0.365472942 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-collectd-container, 
url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, name=rhosp-rhel9/openstack-collectd) Feb 20 03:47:02 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:47:05 localhost systemd[1]: session-29.scope: Deactivated successfully. Feb 20 03:47:05 localhost systemd[1]: session-29.scope: Consumed 7min 7.963s CPU time. Feb 20 03:47:05 localhost systemd-logind[759]: Session 29 logged out. 
Waiting for processes to exit. Feb 20 03:47:05 localhost systemd-logind[759]: Removed session 29. Feb 20 03:47:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:47:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:47:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:47:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:47:09 localhost systemd[1]: tmp-crun.bhG1ie.mount: Deactivated successfully. Feb 20 03:47:09 localhost podman[101525]: 2026-02-20 08:47:09.173799194 +0000 UTC m=+0.105033239 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO 
Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z) Feb 20 03:47:09 localhost podman[101525]: 2026-02-20 08:47:09.192936135 +0000 UTC m=+0.124170180 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, container_name=ovn_controller, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, 
url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Feb 20 03:47:09 localhost podman[101527]: 2026-02-20 08:47:09.224693947 +0000 UTC m=+0.149650358 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:47:09 localhost podman[101527]: 2026-02-20 08:47:09.240150064 +0000 UTC m=+0.165106495 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public) Feb 20 03:47:09 localhost podman[101527]: unhealthy Feb 20 03:47:09 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:47:09 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:47:09 localhost podman[101526]: 2026-02-20 08:47:09.322908173 +0000 UTC m=+0.247035318 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:47:09 localhost podman[101526]: 2026-02-20 08:47:09.332011945 +0000 UTC m=+0.256139130 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:47:09 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:47:09 localhost podman[101528]: 2026-02-20 08:47:09.369440853 +0000 UTC m=+0.291460733 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:32:04Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible) Feb 20 03:47:09 localhost podman[101525]: unhealthy Feb 20 03:47:09 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:47:09 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 03:47:09 localhost podman[101528]: 2026-02-20 08:47:09.448345342 +0000 UTC m=+0.370365272 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5) Feb 20 03:47:09 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:47:10 localhost systemd[1]: tmp-crun.7Dg22G.mount: Deactivated successfully. Feb 20 03:47:13 localhost sshd[101606]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:47:15 localhost systemd[1]: Stopping User Manager for UID 1003... Feb 20 03:47:15 localhost systemd[36724]: Activating special unit Exit the Session... Feb 20 03:47:15 localhost systemd[36724]: Removed slice User Background Tasks Slice. Feb 20 03:47:15 localhost systemd[36724]: Stopped target Main User Target. Feb 20 03:47:15 localhost systemd[36724]: Stopped target Basic System. 
Feb 20 03:47:15 localhost systemd[36724]: Stopped target Paths. Feb 20 03:47:15 localhost systemd[36724]: Stopped target Sockets. Feb 20 03:47:15 localhost systemd[36724]: Stopped target Timers. Feb 20 03:47:15 localhost systemd[36724]: Stopped Mark boot as successful after the user session has run 2 minutes. Feb 20 03:47:15 localhost systemd[36724]: Stopped Daily Cleanup of User's Temporary Directories. Feb 20 03:47:15 localhost systemd[36724]: Closed D-Bus User Message Bus Socket. Feb 20 03:47:15 localhost systemd[36724]: Stopped Create User's Volatile Files and Directories. Feb 20 03:47:15 localhost systemd[36724]: Removed slice User Application Slice. Feb 20 03:47:15 localhost systemd[36724]: Reached target Shutdown. Feb 20 03:47:15 localhost systemd[36724]: Finished Exit the Session. Feb 20 03:47:15 localhost systemd[36724]: Reached target Exit the Session. Feb 20 03:47:15 localhost systemd[1]: user@1003.service: Deactivated successfully. Feb 20 03:47:15 localhost systemd[1]: Stopped User Manager for UID 1003. Feb 20 03:47:15 localhost systemd[1]: user@1003.service: Consumed 4.783s CPU time, read 0B from disk, written 7.0K to disk. Feb 20 03:47:15 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Feb 20 03:47:15 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Feb 20 03:47:15 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Feb 20 03:47:15 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Feb 20 03:47:15 localhost systemd[1]: Removed slice User Slice of UID 1003. Feb 20 03:47:15 localhost systemd[1]: user-1003.slice: Consumed 7min 12.779s CPU time. Feb 20 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:47:17 localhost systemd[1]: tmp-crun.c6Yt0C.mount: Deactivated successfully. 
Feb 20 03:47:17 localhost podman[101610]: 2026-02-20 08:47:17.148392681 +0000 UTC m=+0.086598998 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, 
distribution-scope=public, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com) Feb 20 03:47:17 localhost podman[101610]: 2026-02-20 08:47:17.538110771 +0000 UTC m=+0.476317118 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Feb 20 03:47:17 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:47:27 localhost sshd[101712]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:47:30 localhost sshd[101714]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:47:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:47:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:47:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:47:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:47:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:47:33 localhost podman[101716]: 2026-02-20 08:47:33.217253192 +0000 UTC m=+0.153180036 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, 
config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, distribution-scope=public) Feb 20 03:47:33 localhost podman[101717]: 2026-02-20 08:47:33.183017874 +0000 UTC m=+0.116106850 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z) Feb 20 03:47:33 localhost podman[101718]: 2026-02-20 08:47:33.152732207 +0000 UTC m=+0.087064723 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, 
health_status=healthy, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) Feb 20 03:47:33 localhost podman[101719]: 2026-02-20 08:47:33.270281412 +0000 UTC m=+0.199865530 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, architecture=x86_64, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1766032510) Feb 20 03:47:33 localhost podman[101718]: 2026-02-20 08:47:33.283650595 +0000 UTC m=+0.217983131 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 
17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z) Feb 20 03:47:33 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:47:33 localhost podman[101719]: 2026-02-20 08:47:33.297930057 +0000 UTC m=+0.227514135 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5) Feb 20 03:47:33 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:47:33 localhost podman[101720]: 2026-02-20 08:47:33.236193268 +0000 UTC m=+0.158341066 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-qdrouterd-container) Feb 20 03:47:33 localhost podman[101717]: 2026-02-20 08:47:33.369343825 +0000 UTC m=+0.302432721 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 20 03:47:33 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:47:33 localhost podman[101716]: 2026-02-20 08:47:33.38731314 +0000 UTC m=+0.323239904 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, 
io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:47:33 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:47:33 localhost podman[101720]: 2026-02-20 08:47:33.434978484 +0000 UTC m=+0.357126322 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, container_name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible) Feb 20 03:47:33 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:47:34 localhost systemd[1]: tmp-crun.ZKZ5wt.mount: Deactivated successfully. Feb 20 03:47:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:47:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:47:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:47:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:47:40 localhost systemd[1]: tmp-crun.KGwjfq.mount: Deactivated successfully. Feb 20 03:47:40 localhost systemd[1]: tmp-crun.bPsMEX.mount: Deactivated successfully. Feb 20 03:47:40 localhost podman[101837]: 2026-02-20 08:47:40.219051932 +0000 UTC m=+0.150297798 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, tcib_managed=true) Feb 20 03:47:40 localhost podman[101837]: 2026-02-20 08:47:40.233275142 +0000 UTC m=+0.164520978 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3) Feb 20 03:47:40 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:47:40 localhost podman[101844]: 2026-02-20 08:47:40.322073258 +0000 UTC m=+0.246444471 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:47:40 localhost podman[101838]: 2026-02-20 08:47:40.184788273 +0000 UTC m=+0.109744595 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, 
batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, vcs-type=git) Feb 20 03:47:40 localhost podman[101844]: 2026-02-20 08:47:40.351964461 +0000 UTC m=+0.276335634 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Feb 20 03:47:40 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:47:40 localhost podman[101838]: 2026-02-20 08:47:40.37036393 +0000 UTC m=+0.295320192 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:47:40 localhost podman[101838]: unhealthy Feb 20 03:47:40 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:47:40 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:47:40 localhost podman[101836]: 2026-02-20 08:47:40.370969579 +0000 UTC m=+0.307490968 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, release=1766032510, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:47:40 localhost podman[101836]: 2026-02-20 08:47:40.457060731 +0000 UTC m=+0.393582120 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1) Feb 20 03:47:40 localhost podman[101836]: unhealthy Feb 20 03:47:40 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:47:40 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:47:48 localhost podman[101922]: 2026-02-20 08:47:48.144344153 +0000 UTC m=+0.081006646 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64) Feb 20 03:47:48 localhost podman[101922]: 2026-02-20 08:47:48.517014395 +0000 UTC m=+0.453676838 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, release=1766032510) Feb 20 03:47:48 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:47:58 localhost sshd[101945]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:48:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:48:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:48:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:48:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:48:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:48:04 localhost systemd[1]: tmp-crun.kQ6miA.mount: Deactivated successfully. Feb 20 03:48:04 localhost podman[101949]: 2026-02-20 08:48:04.224813135 +0000 UTC m=+0.156436718 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:48:04 localhost podman[101951]: 2026-02-20 08:48:04.180048031 +0000 UTC m=+0.105993128 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, managed_by=tripleo_ansible, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13) Feb 20 03:48:04 localhost podman[101948]: 2026-02-20 08:48:04.314805667 +0000 UTC m=+0.245176651 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-cron-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible) Feb 20 03:48:04 localhost podman[101949]: 2026-02-20 08:48:04.334228638 +0000 UTC m=+0.265852221 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 20 03:48:04 localhost podman[101951]: 2026-02-20 08:48:04.361678807 +0000 UTC m=+0.287623934 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
container_name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
config_id=tripleo_step1, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 03:48:04 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:48:04 localhost podman[101947]: 2026-02-20 08:48:04.373832082 +0000 UTC m=+0.306262440 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, tcib_managed=true) Feb 20 03:48:04 localhost podman[101950]: 2026-02-20 08:48:04.418837224 +0000 UTC m=+0.341756298 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, vcs-type=git) Feb 20 03:48:04 localhost 
podman[101950]: 2026-02-20 08:48:04.442094074 +0000 UTC m=+0.365013178 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, 
managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1) Feb 20 03:48:04 localhost podman[101948]: 2026-02-20 08:48:04.447244322 +0000 UTC m=+0.377615276 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64) Feb 20 03:48:04 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:48:04 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:48:04 localhost podman[101947]: 2026-02-20 08:48:04.493206074 +0000 UTC m=+0.425636472 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 
17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z) Feb 20 03:48:04 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:48:04 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:48:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:48:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:48:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:48:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:48:11 localhost systemd[1]: tmp-crun.TvzD8b.mount: Deactivated successfully. 
Feb 20 03:48:11 localhost podman[102070]: 2026-02-20 08:48:11.171745661 +0000 UTC m=+0.100339354 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5) Feb 20 03:48:11 localhost podman[102070]: 2026-02-20 08:48:11.217170455 +0000 UTC m=+0.145764188 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 20 03:48:11 localhost podman[102070]: unhealthy Feb 20 03:48:11 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:48:11 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:48:11 localhost podman[102069]: 2026-02-20 08:48:11.266624304 +0000 UTC m=+0.199490898 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 20 03:48:11 localhost podman[102068]: 2026-02-20 08:48:11.221270502 +0000 UTC m=+0.158191572 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:48:11 localhost podman[102069]: 2026-02-20 08:48:11.304463804 +0000 UTC m=+0.237330438 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 20 03:48:11 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:48:11 localhost podman[102076]: 2026-02-20 08:48:11.322189242 +0000 UTC m=+0.244122179 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step5, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 20 03:48:11 localhost podman[102068]: 2026-02-20 08:48:11.356352458 +0000 UTC m=+0.293273508 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, 
name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:48:11 localhost 
podman[102068]: unhealthy Feb 20 03:48:11 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:48:11 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:48:11 localhost podman[102076]: 2026-02-20 08:48:11.380230636 +0000 UTC m=+0.302163573 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_compute, batch=17.1_20260112.1, config_id=tripleo_step5, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510) Feb 20 03:48:11 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:48:19 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:48:19 localhost recover_tripleo_nova_virtqemud[102154]: 63005 Feb 20 03:48:19 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:48:19 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:48:19 localhost systemd[1]: tmp-crun.LTGRKy.mount: Deactivated successfully. Feb 20 03:48:19 localhost podman[102150]: 2026-02-20 08:48:19.159221556 +0000 UTC m=+0.097073223 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1) Feb 20 03:48:19 localhost podman[102150]: 2026-02-20 08:48:19.566107275 +0000 UTC m=+0.503958852 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:48:19 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:48:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:48:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:48:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:48:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:48:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:48:35 localhost podman[102253]: 2026-02-20 08:48:35.159357706 +0000 UTC m=+0.086486276 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 20 03:48:35 localhost podman[102253]: 2026-02-20 08:48:35.196071201 +0000 UTC m=+0.123199741 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, release=1766032510) Feb 20 03:48:35 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:48:35 localhost podman[102255]: 2026-02-20 08:48:35.2119186 +0000 UTC m=+0.134138588 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, container_name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510) Feb 20 03:48:35 localhost systemd[1]: tmp-crun.sBl2Jr.mount: Deactivated successfully. 
Feb 20 03:48:35 localhost podman[102254]: 2026-02-20 08:48:35.270823771 +0000 UTC m=+0.193631327 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:48:35 localhost podman[102254]: 2026-02-20 08:48:35.311364755 +0000 UTC m=+0.234172391 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T23:07:47Z, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, url=https://www.redhat.com) Feb 20 03:48:35 localhost podman[102251]: 2026-02-20 08:48:35.319411024 +0000 UTC m=+0.247912767 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 20 03:48:35 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:48:35 localhost podman[102252]: 2026-02-20 08:48:35.376385495 +0000 UTC m=+0.304146025 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-cron-container, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 20 03:48:35 localhost podman[102251]: 2026-02-20 08:48:35.387566411 +0000 UTC m=+0.316068164 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 
17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 20 03:48:35 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:48:35 localhost podman[102255]: 2026-02-20 08:48:35.402921356 +0000 UTC m=+0.325141404 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, release=1766032510, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team) Feb 20 03:48:35 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:48:35 localhost podman[102252]: 2026-02-20 08:48:35.439770165 +0000 UTC m=+0.367530655 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:48:35 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:48:42 localhost sshd[102367]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:48:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:48:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:48:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:48:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:48:42 localhost podman[102369]: 2026-02-20 08:48:42.151053482 +0000 UTC m=+0.086615969 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, tcib_managed=true) Feb 20 03:48:42 localhost podman[102369]: 2026-02-20 08:48:42.168004206 +0000 UTC m=+0.103566693 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:48:42 localhost podman[102369]: unhealthy Feb 20 03:48:42 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:48:42 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 03:48:42 localhost podman[102376]: 2026-02-20 08:48:42.214948547 +0000 UTC m=+0.138626386 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 20 03:48:42 localhost podman[102371]: 2026-02-20 08:48:42.264978664 +0000 UTC m=+0.192835513 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack 
TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:48:42 localhost podman[102370]: 2026-02-20 08:48:42.299720278 +0000 UTC m=+0.231974052 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:48:42 localhost podman[102371]: 2026-02-20 08:48:42.306017324 +0000 UTC m=+0.233874133 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1) Feb 20 03:48:42 localhost podman[102371]: unhealthy Feb 20 03:48:42 localhost podman[102376]: 2026-02-20 08:48:42.320548923 +0000 UTC m=+0.244226752 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, tcib_managed=true, batch=17.1_20260112.1, container_name=nova_compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:48:42 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:48:42 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:48:42 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:48:42 localhost podman[102370]: 2026-02-20 08:48:42.359960271 +0000 UTC m=+0.292214035 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:48:42 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:48:43 localhost systemd[1]: tmp-crun.H0xRD2.mount: Deactivated successfully. Feb 20 03:48:49 localhost sshd[102453]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:48:50 localhost podman[102455]: 2026-02-20 08:48:50.142282883 +0000 UTC m=+0.081028716 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:48:50 localhost podman[102455]: 2026-02-20 08:48:50.513106718 +0000 UTC m=+0.451852521 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container) Feb 20 03:48:50 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:49:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:49:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:49:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:49:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:49:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:49:06 localhost podman[102488]: 2026-02-20 08:49:06.150743216 +0000 UTC m=+0.080517700 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-type=git, container_name=metrics_qdr) Feb 20 03:49:06 localhost podman[102482]: 2026-02-20 08:49:06.208542373 +0000 UTC m=+0.140082482 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com) Feb 20 03:49:06 localhost systemd[1]: tmp-crun.vqoOob.mount: Deactivated 
successfully. Feb 20 03:49:06 localhost podman[102480]: 2026-02-20 08:49:06.263781192 +0000 UTC m=+0.200889703 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-cron, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=logrotate_crond, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.) Feb 20 03:49:06 localhost podman[102482]: 2026-02-20 08:49:06.316350477 +0000 UTC m=+0.247890576 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:49:06 localhost podman[102481]: 2026-02-20 08:49:06.323329922 +0000 UTC m=+0.258363328 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z) Feb 20 03:49:06 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:49:06 localhost podman[102481]: 2026-02-20 08:49:06.360048158 +0000 UTC m=+0.295081584 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step3, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510) Feb 20 03:49:06 localhost podman[102488]: 2026-02-20 08:49:06.371322726 +0000 UTC m=+0.301097190 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13) Feb 20 03:49:06 localhost systemd[1]: 
55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:49:06 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:49:06 localhost podman[102480]: 2026-02-20 08:49:06.395970258 +0000 UTC m=+0.333078769 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, release=1766032510, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:49:06 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:49:06 localhost podman[102479]: 2026-02-20 08:49:06.364471865 +0000 UTC m=+0.303950329 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 20 03:49:06 localhost podman[102479]: 2026-02-20 08:49:06.452060713 +0000 UTC m=+0.391539137 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, 
config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 20 03:49:06 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:49:13 localhost systemd[1]: tmp-crun.1CsCQd.mount: Deactivated successfully. 
Feb 20 03:49:13 localhost podman[102600]: 2026-02-20 08:49:13.158183771 +0000 UTC m=+0.082615006 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:49:13 localhost podman[102600]: 2026-02-20 08:49:13.210047124 +0000 UTC m=+0.134478369 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, release=1766032510, vendor=Red 
Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible) Feb 20 03:49:13 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:49:13 localhost podman[102599]: 2026-02-20 08:49:13.215548364 +0000 UTC m=+0.142642921 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 
03:49:13 localhost podman[102599]: 2026-02-20 08:49:13.299957344 +0000 UTC m=+0.227051971 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:49:13 localhost podman[102599]: unhealthy Feb 20 03:49:13 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:49:13 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:49:13 localhost podman[102597]: 2026-02-20 08:49:13.315875726 +0000 UTC m=+0.246611425 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13) Feb 20 03:49:13 localhost podman[102597]: 2026-02-20 08:49:13.35640171 +0000 UTC m=+0.287137449 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible) Feb 20 03:49:13 localhost podman[102597]: unhealthy Feb 20 03:49:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:49:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:49:13 localhost podman[102598]: 2026-02-20 08:49:13.266869851 +0000 UTC m=+0.197983732 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public) Feb 20 03:49:13 localhost podman[102598]: 2026-02-20 08:49:13.402402401 +0000 UTC m=+0.333516242 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, release=1766032510, container_name=iscsid, architecture=x86_64, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:49:13 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:49:21 localhost podman[102682]: 2026-02-20 08:49:21.126013388 +0000 UTC m=+0.069045656 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, version=17.1.13, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:49:21 localhost podman[102682]: 2026-02-20 08:49:21.489090354 +0000 UTC m=+0.432122572 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:49:21 localhost systemd[1]: 
b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:49:25 localhost sshd[102705]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:49:36 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:49:36 localhost recover_tripleo_nova_virtqemud[102786]: 63005 Feb 20 03:49:36 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:49:36 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:49:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:49:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:49:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:49:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:49:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:49:37 localhost systemd[1]: tmp-crun.UBv4t6.mount: Deactivated successfully. 
Feb 20 03:49:37 localhost podman[102788]: 2026-02-20 08:49:37.173780039 +0000 UTC m=+0.096892587 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp-rhel9/openstack-cron) Feb 20 03:49:37 localhost podman[102788]: 2026-02-20 08:49:37.184969555 +0000 UTC m=+0.108082103 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public) Feb 20 03:49:37 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:49:37 localhost systemd[1]: tmp-crun.Wcf7yq.mount: Deactivated successfully. 
Feb 20 03:49:37 localhost podman[102787]: 2026-02-20 08:49:37.27794772 +0000 UTC m=+0.202787522 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:07:30Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 20 03:49:37 localhost podman[102789]: 2026-02-20 08:49:37.32648372 +0000 UTC m=+0.249950319 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, release=1766032510, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:49:37 localhost podman[102791]: 2026-02-20 08:49:37.245453055 +0000 UTC m=+0.162443174 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, 
build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step1, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 20 03:49:37 localhost podman[102789]: 2026-02-20 08:49:37.365005141 +0000 UTC m=+0.288471740 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:49:37 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:49:37 localhost podman[102790]: 2026-02-20 08:49:37.379613023 +0000 UTC m=+0.300424349 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:49:37 localhost podman[102787]: 2026-02-20 08:49:37.402238792 +0000 UTC m=+0.327078624 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.) 
Feb 20 03:49:37 localhost podman[102790]: 2026-02-20 08:49:37.409988352 +0000 UTC m=+0.330799638 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:49:37 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:49:37 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:49:37 localhost podman[102791]: 2026-02-20 08:49:37.453989003 +0000 UTC m=+0.370979032 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:49:37 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:49:41 localhost sshd[102904]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:49:44 localhost systemd[1]: tmp-crun.xCE988.mount: Deactivated successfully. 
Feb 20 03:49:44 localhost podman[102907]: 2026-02-20 08:49:44.157005945 +0000 UTC m=+0.095779613 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 20 03:49:44 localhost podman[102907]: 2026-02-20 08:49:44.194088472 +0000 UTC m=+0.132862060 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team) Feb 20 03:49:44 localhost podman[102909]: 2026-02-20 08:49:44.205488184 +0000 UTC m=+0.138214585 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:49:44 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:49:44 localhost podman[102906]: 2026-02-20 08:49:44.256751179 +0000 UTC m=+0.196326971 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:49:44 localhost podman[102906]: 2026-02-20 08:49:44.274026213 +0000 UTC m=+0.213602005 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, container_name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 03:49:44 localhost podman[102906]: unhealthy Feb 20 03:49:44 localhost podman[102909]: 2026-02-20 08:49:44.28655399 +0000 UTC m=+0.219280441 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true) Feb 20 03:49:44 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:49:44 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:49:44 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:49:44 localhost podman[102908]: 2026-02-20 08:49:44.369812594 +0000 UTC m=+0.303560776 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true) Feb 20 03:49:44 localhost podman[102908]: 2026-02-20 08:49:44.38812059 +0000 UTC m=+0.321868792 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, 
io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:49:44 localhost podman[102908]: unhealthy Feb 20 03:49:44 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:49:44 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:49:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:49:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:49:49 localhost sshd[102990]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:49:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:49:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 
MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:49:52 localhost podman[102992]: 2026-02-20 08:49:52.1407247 +0000 UTC m=+0.078660223 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:49:52 localhost podman[102992]: 2026-02-20 08:49:52.514999702 +0000 UTC m=+0.452935215 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1766032510, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:49:52 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated 
successfully. Feb 20 03:50:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:50:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:50:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:50:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:50:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:50:08 localhost podman[103017]: 2026-02-20 08:50:08.168010846 +0000 UTC m=+0.095091521 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, distribution-scope=public, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 
collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13) Feb 20 03:50:08 localhost systemd[1]: tmp-crun.KDiTRe.mount: Deactivated successfully. 
Feb 20 03:50:08 localhost podman[103015]: 2026-02-20 08:50:08.223232614 +0000 UTC m=+0.157031506 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 20 03:50:08 localhost podman[103021]: 2026-02-20 08:50:08.280686661 +0000 UTC m=+0.200578473 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Feb 20 03:50:08 localhost podman[103021]: 2026-02-20 08:50:08.314153345 +0000 UTC m=+0.234045147 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64) Feb 20 03:50:08 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:50:08 localhost podman[103016]: 2026-02-20 08:50:08.339039274 +0000 UTC m=+0.269185584 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Feb 20 03:50:08 localhost podman[103017]: 2026-02-20 08:50:08.350489609 +0000 UTC m=+0.277570254 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:50:08 localhost systemd[1]: 
55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:50:08 localhost podman[103016]: 2026-02-20 08:50:08.378103952 +0000 UTC m=+0.308250222 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, vcs-type=git, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond) Feb 20 03:50:08 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:50:08 localhost podman[103025]: 2026-02-20 08:50:08.438910462 +0000 UTC m=+0.355422930 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=) Feb 20 03:50:08 localhost podman[103015]: 2026-02-20 08:50:08.456262598 +0000 UTC m=+0.390061440 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vcs-type=git, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 20 03:50:08 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:50:08 localhost podman[103025]: 2026-02-20 08:50:08.641412123 +0000 UTC m=+0.557924671 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 20 03:50:08 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:50:10 localhost sshd[103133]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:50:15 localhost podman[103140]: 2026-02-20 08:50:15.187265595 +0000 UTC m=+0.111589781 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, version=17.1.13, container_name=nova_compute, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5) Feb 20 03:50:15 localhost systemd[1]: tmp-crun.6BynHI.mount: Deactivated successfully. 
Feb 20 03:50:15 localhost podman[103135]: 2026-02-20 08:50:15.241161562 +0000 UTC m=+0.177173969 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4) Feb 20 03:50:15 localhost podman[103135]: 2026-02-20 08:50:15.257980932 +0000 UTC m=+0.193993329 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, container_name=ovn_controller) Feb 20 03:50:15 localhost podman[103140]: 2026-02-20 08:50:15.266878767 +0000 UTC m=+0.191202923 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, container_name=nova_compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:50:15 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:50:15 localhost podman[103135]: unhealthy Feb 20 03:50:15 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:50:15 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:50:15 localhost podman[103137]: 2026-02-20 08:50:15.354771365 +0000 UTC m=+0.283829597 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:50:15 localhost podman[103136]: 2026-02-20 08:50:15.397276239 +0000 UTC m=+0.329516980 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc.) Feb 20 03:50:15 localhost podman[103136]: 2026-02-20 08:50:15.412087416 +0000 UTC m=+0.344328187 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13) Feb 20 03:50:15 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:50:15 localhost podman[103137]: 2026-02-20 08:50:15.450429422 +0000 UTC m=+0.379487664 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO 
Team, managed_by=tripleo_ansible) Feb 20 03:50:15 localhost podman[103137]: unhealthy Feb 20 03:50:15 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:50:15 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:50:23 localhost podman[103220]: 2026-02-20 08:50:23.145560507 +0000 UTC m=+0.082544014 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, vendor=Red Hat, Inc., version=17.1.13) Feb 20 03:50:23 localhost podman[103220]: 2026-02-20 08:50:23.513000057 +0000 UTC m=+0.449983584 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:50:23 
localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:50:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:50:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:50:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:50:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:50:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:50:39 localhost systemd[1]: tmp-crun.4QZLik.mount: Deactivated successfully. Feb 20 03:50:39 localhost podman[103324]: 2026-02-20 08:50:39.162193574 +0000 UTC m=+0.084933836 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:50:39 localhost systemd[1]: tmp-crun.4ihxHi.mount: Deactivated successfully. 
Feb 20 03:50:39 localhost podman[103321]: 2026-02-20 08:50:39.233420927 +0000 UTC m=+0.160148162 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5) Feb 20 03:50:39 localhost podman[103324]: 2026-02-20 08:50:39.238010378 +0000 UTC m=+0.160750690 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:50:39 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:50:39 localhost podman[103321]: 2026-02-20 08:50:39.29302958 +0000 UTC m=+0.219756775 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi) Feb 20 03:50:39 localhost podman[103329]: 2026-02-20 08:50:39.292164093 +0000 UTC m=+0.205391592 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, config_id=tripleo_step1) Feb 20 03:50:39 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:50:39 localhost podman[103323]: 2026-02-20 08:50:39.259263966 +0000 UTC m=+0.181836123 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:50:39 localhost podman[103323]: 2026-02-20 08:50:39.339159316 +0000 UTC m=+0.261731433 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=collectd, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:50:39 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:50:39 localhost podman[103322]: 2026-02-20 08:50:39.431934745 +0000 UTC m=+0.357152084 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 
cron, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com) Feb 20 03:50:39 localhost podman[103322]: 2026-02-20 08:50:39.468417843 +0000 UTC m=+0.393635252 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 20 03:50:39 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:50:39 localhost podman[103329]: 2026-02-20 08:50:39.511154084 +0000 UTC m=+0.424381583 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:50:39 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:50:46 localhost podman[103448]: 2026-02-20 08:50:46.160457064 +0000 UTC m=+0.089188257 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:50:46 localhost systemd[1]: tmp-crun.satLMg.mount: Deactivated successfully. 
Feb 20 03:50:46 localhost podman[103448]: 2026-02-20 08:50:46.197459659 +0000 UTC m=+0.126190872 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:50:46 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:50:46 localhost podman[103441]: 2026-02-20 08:50:46.216454575 +0000 UTC m=+0.149780661 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vcs-type=git, build-date=2026-01-12T22:34:43Z, container_name=iscsid, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:50:46 localhost podman[103440]: 2026-02-20 08:50:46.200473452 +0000 UTC m=+0.139002609 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 20 03:50:46 localhost podman[103442]: 2026-02-20 08:50:46.253844742 +0000 UTC m=+0.183611168 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step4) Feb 20 03:50:46 localhost podman[103442]: 2026-02-20 08:50:46.270953721 +0000 UTC m=+0.200720107 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:50:46 localhost podman[103442]: unhealthy Feb 20 03:50:46 localhost podman[103441]: 2026-02-20 08:50:46.278300568 +0000 UTC m=+0.211626664 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, container_name=iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public) Feb 20 03:50:46 localhost systemd[1]: 
8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:50:46 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:50:46 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:50:46 localhost podman[103440]: 2026-02-20 08:50:46.333975189 +0000 UTC m=+0.272504306 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:50:46 localhost podman[103440]: unhealthy Feb 20 03:50:46 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:50:46 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:50:54 localhost podman[103524]: 2026-02-20 08:50:54.14197936 +0000 UTC m=+0.081271484 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, architecture=x86_64) Feb 20 03:50:54 localhost podman[103524]: 2026-02-20 08:50:54.512280759 +0000 UTC m=+0.451572913 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_migration_target, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:50:54 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:50:56 localhost sshd[103548]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:50:59 localhost sshd[103550]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:51:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:51:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:51:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:51:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:51:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:51:10 localhost podman[103552]: 2026-02-20 08:51:10.172740641 +0000 UTC m=+0.104876634 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:51:10 localhost podman[103552]: 2026-02-20 08:51:10.204837823 +0000 UTC m=+0.136973836 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4) Feb 20 03:51:10 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated 
successfully. Feb 20 03:51:10 localhost podman[103555]: 2026-02-20 08:51:10.22283851 +0000 UTC m=+0.149019008 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, container_name=ceilometer_agent_compute, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 20 03:51:10 localhost podman[103555]: 2026-02-20 08:51:10.259863375 +0000 UTC m=+0.186043913 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc.) Feb 20 03:51:10 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:51:10 localhost podman[103556]: 2026-02-20 08:51:10.274284901 +0000 UTC m=+0.198027794 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:51:10 localhost podman[103554]: 2026-02-20 08:51:10.315493025 +0000 UTC m=+0.244982286 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, container_name=collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:51:10 localhost podman[103554]: 2026-02-20 08:51:10.326999251 +0000 UTC m=+0.256488512 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:51:10 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:51:10 localhost podman[103553]: 2026-02-20 08:51:10.453851052 +0000 UTC m=+0.384535659 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=logrotate_crond, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:51:10 localhost podman[103553]: 2026-02-20 08:51:10.489091482 +0000 UTC m=+0.419776059 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, container_name=logrotate_crond, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 20 03:51:10 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:51:10 localhost podman[103556]: 2026-02-20 08:51:10.548412726 +0000 UTC m=+0.472155649 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1766032510, tcib_managed=true, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:51:10 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:51:17 localhost podman[103671]: 2026-02-20 08:51:17.141723496 +0000 UTC m=+0.074567717 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z) Feb 20 03:51:17 localhost podman[103670]: 2026-02-20 08:51:17.15574956 +0000 UTC m=+0.088127266 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:36:40Z, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:51:17 localhost systemd[1]: tmp-crun.tMCoeB.mount: Deactivated successfully. 
Feb 20 03:51:17 localhost podman[103672]: 2026-02-20 08:51:17.21752475 +0000 UTC m=+0.147974626 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 03:51:17 localhost podman[103670]: 2026-02-20 08:51:17.22561023 +0000 UTC m=+0.157987966 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 03:51:17 localhost podman[103670]: unhealthy
Feb 20 03:51:17 localhost podman[103672]: 2026-02-20 08:51:17.237095895 +0000 UTC m=+0.167545771 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, version=17.1.13)
Feb 20 03:51:17 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 03:51:17 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 03:51:17 localhost podman[103672]: unhealthy
Feb 20 03:51:17 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 03:51:17 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 03:51:17 localhost podman[103673]: 2026-02-20 08:51:17.31459298 +0000 UTC m=+0.236186383 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container)
Feb 20 03:51:17 localhost podman[103671]: 2026-02-20 08:51:17.332430562 +0000 UTC m=+0.265274793 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 03:51:17 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 03:51:17 localhost podman[103673]: 2026-02-20 08:51:17.380282982 +0000 UTC m=+0.301876415 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step5, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container)
Feb 20 03:51:17 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 03:51:25 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 03:51:25 localhost recover_tripleo_nova_virtqemud[103758]: 63005
Feb 20 03:51:25 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 03:51:25 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 03:51:25 localhost podman[103756]: 2026-02-20 08:51:25.13724181 +0000 UTC m=+0.077047614 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, release=1766032510)
Feb 20 03:51:25 localhost podman[103756]: 2026-02-20 08:51:25.506884148 +0000 UTC m=+0.446689942 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 20 03:51:25 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 03:51:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 03:51:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 03:51:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 03:51:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 03:51:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 03:51:41 localhost systemd[1]: tmp-crun.8M6QAu.mount: Deactivated successfully.
Feb 20 03:51:41 localhost podman[103910]: 2026-02-20 08:51:41.162573384 +0000 UTC m=+0.101835949 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 03:51:41 localhost podman[103910]: 2026-02-20 08:51:41.172873373 +0000 UTC m=+0.112135978 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, batch=17.1_20260112.1)
Feb 20 03:51:41 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 03:51:41 localhost podman[103909]: 2026-02-20 08:51:41.216585184 +0000 UTC m=+0.154663963 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true)
Feb 20 03:51:41 localhost podman[103911]: 2026-02-20 08:51:41.252817084 +0000 UTC m=+0.187121266 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 03:51:41 localhost podman[103911]: 2026-02-20 08:51:41.268615603 +0000 UTC m=+0.202919815 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, container_name=collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 03:51:41 localhost podman[103912]: 2026-02-20 08:51:41.180140177 +0000 UTC m=+0.113001685 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible)
Feb 20 03:51:41 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 03:51:41 localhost podman[103912]: 2026-02-20 08:51:41.315208974 +0000 UTC m=+0.248070532 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z) Feb 20 03:51:41 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:51:41 localhost podman[103914]: 2026-02-20 08:51:41.362096923 +0000 UTC m=+0.285182708 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com) Feb 20 03:51:41 localhost podman[103909]: 2026-02-20 08:51:41.401028386 +0000 UTC m=+0.339107115 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z) Feb 20 03:51:41 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:51:41 localhost podman[103914]: 2026-02-20 08:51:41.548116465 +0000 UTC m=+0.471202260 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.13, container_name=metrics_qdr, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1) Feb 20 03:51:41 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:51:43 localhost sshd[104027]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:51:48 localhost podman[104029]: 2026-02-20 08:51:48.153791409 +0000 UTC m=+0.090065375 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.expose-services=, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:51:48 localhost podman[104030]: 2026-02-20 08:51:48.211130821 +0000 UTC m=+0.143153107 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 20 03:51:48 localhost podman[104030]: 2026-02-20 08:51:48.220871833 +0000 UTC m=+0.152894079 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:51:48 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:51:48 localhost podman[104031]: 2026-02-20 08:51:48.272985054 +0000 UTC m=+0.202989417 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, vendor=Red Hat, Inc.) 
Feb 20 03:51:48 localhost podman[104031]: 2026-02-20 08:51:48.294048435 +0000 UTC m=+0.224052848 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, container_name=ovn_metadata_agent) Feb 20 03:51:48 localhost podman[104031]: unhealthy Feb 20 03:51:48 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:51:48 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:51:48 localhost podman[104029]: 2026-02-20 08:51:48.328527171 +0000 UTC m=+0.264801137 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1) Feb 20 03:51:48 localhost podman[104029]: unhealthy Feb 20 03:51:48 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:51:48 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:51:48 localhost podman[104037]: 2026-02-20 08:51:48.367044862 +0000 UTC m=+0.292676640 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 20 03:51:48 localhost podman[104037]: 2026-02-20 
08:51:48.395942955 +0000 UTC m=+0.321574753 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:51:48 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:51:56 localhost podman[104111]: 2026-02-20 08:51:56.138468888 +0000 UTC m=+0.075872447 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:51:56 localhost podman[104111]: 2026-02-20 08:51:56.513959687 +0000 UTC m=+0.451363266 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=nova_migration_target, url=https://www.redhat.com, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:51:56 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:52:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:52:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:52:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:52:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:52:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:52:12 localhost podman[104136]: 2026-02-20 08:52:12.157920494 +0000 UTC m=+0.093078049 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 20 03:52:12 localhost podman[104136]: 2026-02-20 08:52:12.166945764 +0000 UTC m=+0.102103329 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, vcs-type=git, config_id=tripleo_step3, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 20 03:52:12 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:52:12 localhost podman[104134]: 2026-02-20 08:52:12.21083111 +0000 UTC m=+0.151121723 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:52:12 localhost podman[104137]: 2026-02-20 08:52:12.264626284 +0000 UTC m=+0.192530784 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:52:12 localhost podman[104137]: 2026-02-20 08:52:12.289935746 +0000 UTC m=+0.217840236 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com) Feb 20 03:52:12 localhost podman[104134]: 2026-02-20 08:52:12.289004547 +0000 UTC m=+0.229295120 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, 
batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, release=1766032510, version=17.1.13, architecture=x86_64) Feb 20 03:52:12 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:52:12 localhost podman[104135]: 2026-02-20 08:52:12.373158309 +0000 UTC m=+0.309149269 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-cron-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.openshift.expose-services=, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 20 03:52:12 localhost podman[104135]: 2026-02-20 08:52:12.38193922 +0000 UTC m=+0.317930190 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-type=git, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1) Feb 20 03:52:12 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:52:12 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:52:12 localhost podman[104144]: 2026-02-20 08:52:12.482457148 +0000 UTC m=+0.407948813 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step1, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:52:12 localhost podman[104144]: 2026-02-20 08:52:12.690015895 +0000 UTC m=+0.615507590 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5) Feb 20 03:52:12 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. 
Feb 20 03:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:52:19 localhost systemd[1]: tmp-crun.mrsJU8.mount: Deactivated successfully. Feb 20 03:52:19 localhost podman[104261]: 2026-02-20 08:52:19.153912824 +0000 UTC m=+0.080918663 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, release=1766032510, vcs-type=git, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible) Feb 20 03:52:19 localhost podman[104253]: 2026-02-20 08:52:19.200307228 +0000 UTC m=+0.137373548 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Feb 20 03:52:19 localhost podman[104261]: 2026-02-20 08:52:19.230951805 +0000 UTC m=+0.157957644 container exec_died 
a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.) 
Feb 20 03:52:19 localhost podman[104253]: 2026-02-20 08:52:19.240311305 +0000 UTC m=+0.177377625 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 20 03:52:19 localhost podman[104253]: unhealthy Feb 20 03:52:19 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:52:19 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:52:19 localhost podman[104255]: 2026-02-20 08:52:19.264470702 +0000 UTC m=+0.194650229 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 20 03:52:19 localhost podman[104255]: 2026-02-20 08:52:19.310146085 +0000 UTC m=+0.240325632 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible) Feb 20 03:52:19 localhost podman[104255]: unhealthy Feb 20 03:52:19 localhost podman[104254]: 2026-02-20 08:52:19.317170201 +0000 UTC m=+0.248492434 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:52:19 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:52:19 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:52:19 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:52:19 localhost podman[104254]: 2026-02-20 08:52:19.35626135 +0000 UTC m=+0.287583563 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Feb 20 03:52:19 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:52:19 localhost sshd[104338]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:52:20 localhost systemd[1]: tmp-crun.hJgdgX.mount: Deactivated successfully. Feb 20 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:52:27 localhost systemd[1]: tmp-crun.cEhAdq.mount: Deactivated successfully. 
Feb 20 03:52:27 localhost podman[104341]: 2026-02-20 08:52:27.145811996 +0000 UTC m=+0.085234087 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, vcs-type=git) Feb 20 03:52:27 localhost podman[104341]: 2026-02-20 08:52:27.535654569 +0000 UTC m=+0.475076590 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, container_name=nova_migration_target, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 20 03:52:27 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:52:30 localhost sshd[104364]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:52:39 localhost sshd[104444]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:52:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:52:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:52:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:52:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:52:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:52:43 localhost podman[104446]: 2026-02-20 08:52:43.171733769 +0000 UTC m=+0.101513659 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 20 03:52:43 localhost podman[104446]: 2026-02-20 08:52:43.203032647 +0000 UTC m=+0.132812487 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, release=1766032510, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:52:43 localhost systemd[1]: tmp-crun.XAhXxQ.mount: Deactivated successfully. Feb 20 03:52:43 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:52:43 localhost podman[104449]: 2026-02-20 08:52:43.223664535 +0000 UTC m=+0.148895334 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, distribution-scope=public, version=17.1.13, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com) Feb 20 03:52:43 localhost podman[104449]: 2026-02-20 08:52:43.259113372 +0000 UTC m=+0.184344211 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 20 03:52:43 localhost podman[104448]: 2026-02-20 08:52:43.271117532 +0000 UTC m=+0.197519538 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:52:43 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:52:43 localhost podman[104450]: 2026-02-20 08:52:43.325009388 +0000 UTC m=+0.248395941 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, container_name=metrics_qdr) Feb 20 03:52:43 localhost podman[104447]: 2026-02-20 08:52:43.376841921 +0000 UTC m=+0.303609328 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:52:43 localhost podman[104447]: 2026-02-20 
08:52:43.394213328 +0000 UTC m=+0.320980745 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:52:43 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:52:43 localhost podman[104448]: 2026-02-20 08:52:43.448015201 +0000 UTC m=+0.374417127 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:52:43 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:52:43 localhost podman[104450]: 2026-02-20 08:52:43.527701835 +0000 UTC m=+0.451088368 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, tcib_managed=true, architecture=x86_64, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Feb 20 03:52:43 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:52:46 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:52:46 localhost recover_tripleo_nova_virtqemud[104566]: 63005 Feb 20 03:52:46 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:52:46 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:52:50 localhost systemd[1]: tmp-crun.Qm3dry.mount: Deactivated successfully. 
Feb 20 03:52:50 localhost podman[104569]: 2026-02-20 08:52:50.146706559 +0000 UTC m=+0.078163058 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:52:50 localhost podman[104567]: 2026-02-20 08:52:50.210743599 +0000 UTC m=+0.145768078 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) 
Feb 20 03:52:50 localhost podman[104569]: 2026-02-20 08:52:50.230941713 +0000 UTC m=+0.162398252 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1) Feb 20 03:52:50 localhost podman[104569]: unhealthy Feb 20 03:52:50 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:52:50 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:52:50 localhost podman[104570]: 2026-02-20 08:52:50.191163483 +0000 UTC m=+0.116822983 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, architecture=x86_64, url=https://www.redhat.com) Feb 20 03:52:50 localhost podman[104568]: 2026-02-20 08:52:50.250994113 +0000 UTC m=+0.186218618 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, managed_by=tripleo_ansible, 
batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:52:50 localhost 
podman[104568]: 2026-02-20 08:52:50.268129812 +0000 UTC m=+0.203354247 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Feb 20 03:52:50 localhost podman[104570]: 2026-02-20 08:52:50.280147105 +0000 UTC m=+0.205806585 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, config_id=tripleo_step5, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:52:50 localhost systemd[1]: 
5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:52:50 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:52:50 localhost podman[104567]: 2026-02-20 08:52:50.304241949 +0000 UTC m=+0.239266448 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:52:50 localhost podman[104567]: unhealthy Feb 20 03:52:50 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:52:50 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:52:58 localhost systemd[1]: tmp-crun.U1pN5e.mount: Deactivated successfully. 
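The records above show a recurring pattern: podman emits a `container health_status … health_status=unhealthy` event, then an `exec_died` event, and systemd marks the transient healthcheck unit failed with `status=1/FAILURE`. When triaging a captured log like this offline, the comma-separated `key=value` label list inside each container event can be picked apart with a small parser. A minimal sketch, assuming the record layout seen above (the `parse_podman_event` helper and its regexes are illustrative, not part of podman):

```python
import re

# Illustrative parser for podman journal "container <event>" records like the
# ones above; the helper name and regexes are ours, not part of podman.
EVENT_RE = re.compile(
    r"container (?P<event>\w+) (?P<cid>[0-9a-f]{64}) \((?P<labels>.*)\)"
)

def parse_podman_event(record):
    """Return event type, container id, and selected labels, or None."""
    m = EVENT_RE.search(record)
    if m is None:
        return None
    labels = m.group("labels")

    def label(key):
        # Labels are comma-separated key=value pairs; take the first match,
        # which for "name" is the container name rather than the image name.
        lm = re.search(r"(?:^|, )" + re.escape(key) + r"=([^,)]*)", labels)
        return lm.group(1) if lm else None

    return {
        "event": m.group("event"),
        "cid": m.group("cid"),
        "name": label("name"),
        "health_status": label("health_status"),
    }
```

On the `ovn_metadata_agent` and `ovn_controller` records above, this kind of extraction surfaces the `health_status=unhealthy` labels that precede the `exit-code` failures of the matching transient `.service` units.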
Feb 20 03:52:58 localhost podman[104646]: 2026-02-20 08:52:58.151809999 +0000 UTC m=+0.091562153 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 20 03:52:58 localhost podman[104646]: 2026-02-20 08:52:58.527983069 +0000 UTC m=+0.467735253 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red 
Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git) Feb 20 03:52:58 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:53:01 localhost sshd[104669]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:53:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:53:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
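Each container event above also embeds the full TripleO `config_data` as a Python dict literal (healthcheck test, restart policy, `start_order`, volume mounts, and so on). Assuming the literal is well-formed as logged, `ast.literal_eval` can recover it for comparison across containers; the shortened `raw` sample below mirrors the `nova_compute` record and is illustrative, not a verbatim copy:

```python
import ast

# config_data in these records is a Python dict literal; assuming it is
# well-formed, ast.literal_eval recovers it. Shortened illustrative sample
# based on the nova_compute record above.
raw = (
    "{'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, "
    "'healthcheck': {'test': '/openstack/healthcheck 5672'}, "
    "'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', "
    "'net': 'host', 'privileged': True, 'restart': 'always', "
    "'start_order': 3, 'user': 'nova'}"
)

config = ast.literal_eval(raw)
```

From here, fields such as `config['healthcheck']['test']` or `config['volumes']` can be diffed between containers when chasing down why one healthcheck reports unhealthy while another stays healthy.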
Feb 20 03:53:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:53:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:53:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:53:14 localhost systemd[1]: tmp-crun.fdum35.mount: Deactivated successfully. Feb 20 03:53:14 localhost podman[104671]: 2026-02-20 08:53:14.14907752 +0000 UTC m=+0.084508264 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 20 03:53:14 localhost podman[104673]: 2026-02-20 08:53:14.166151007 +0000 UTC m=+0.094715549 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Feb 20 03:53:14 localhost podman[104671]: 2026-02-20 08:53:14.177991033 +0000 UTC m=+0.113421787 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1) Feb 20 03:53:14 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:53:14 localhost podman[104673]: 2026-02-20 08:53:14.206001539 +0000 UTC m=+0.134566041 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, 
name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 20 03:53:14 localhost podman[104672]: 2026-02-20 08:53:14.215868695 +0000 UTC m=+0.147049718 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, 
health_status=healthy, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:53:14 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:53:14 localhost podman[104672]: 2026-02-20 08:53:14.225899565 +0000 UTC m=+0.157080518 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:53:14 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:53:14 localhost podman[104679]: 2026-02-20 08:53:14.286613762 +0000 UTC m=+0.211180560 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:53:14 localhost podman[104679]: 2026-02-20 08:53:14.317974891 +0000 UTC m=+0.242541769 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 20 03:53:14 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:53:14 localhost podman[104683]: 2026-02-20 08:53:14.331141049 +0000 UTC m=+0.252409715 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 03:53:14 localhost podman[104683]: 2026-02-20 08:53:14.53107211 +0000 UTC m=+0.452340776 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:53:14 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:53:15 localhost systemd[1]: tmp-crun.ULcXmc.mount: Deactivated successfully. Feb 20 03:53:15 localhost sshd[104788]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:53:21 localhost systemd[1]: tmp-crun.aJmwPh.mount: Deactivated successfully. Feb 20 03:53:21 localhost podman[104790]: 2026-02-20 08:53:21.172908852 +0000 UTC m=+0.106234556 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:53:21 localhost podman[104790]: 2026-02-20 08:53:21.189293488 +0000 UTC m=+0.122619242 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5) Feb 20 03:53:21 localhost podman[104790]: unhealthy Feb 20 03:53:21 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:53:21 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:53:21 localhost systemd[1]: tmp-crun.CaygEn.mount: Deactivated successfully. 
Feb 20 03:53:21 localhost podman[104791]: 2026-02-20 08:53:21.269772027 +0000 UTC m=+0.201228163 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, 
vcs-type=git, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:53:21 localhost podman[104791]: 2026-02-20 08:53:21.280108137 +0000 UTC m=+0.211564253 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, container_name=iscsid, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team) Feb 20 03:53:21 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:53:21 localhost podman[104792]: 2026-02-20 08:53:21.329034329 +0000 UTC m=+0.254870121 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:53:21 localhost podman[104792]: 2026-02-20 08:53:21.346998665 +0000 UTC m=+0.272834447 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, 
vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:53:21 localhost podman[104792]: unhealthy Feb 20 03:53:21 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:53:21 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:53:21 localhost podman[104796]: 2026-02-20 08:53:21.39990848 +0000 UTC m=+0.322176292 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=) Feb 20 03:53:21 
localhost podman[104796]: 2026-02-20 08:53:21.430876897 +0000 UTC m=+0.353144709 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, container_name=nova_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510) Feb 20 03:53:21 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:53:29 localhost systemd[1]: tmp-crun.6F3v2d.mount: Deactivated successfully. 
Feb 20 03:53:29 localhost podman[104875]: 2026-02-20 08:53:29.147910851 +0000 UTC m=+0.086856356 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 20 03:53:29 localhost podman[104875]: 2026-02-20 08:53:29.509512381 +0000 UTC m=+0.448457876 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 20 03:53:29 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:53:38 localhost podman[104998]: 2026-02-20 08:53:38.696951526 +0000 UTC m=+0.093682728 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, name=rhceph, version=7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 03:53:38 localhost podman[104998]: 2026-02-20 08:53:38.798026691 +0000 UTC m=+0.194757893 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, 
build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux ) Feb 20 03:53:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:53:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:53:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:53:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:53:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. 
Feb 20 03:53:45 localhost podman[105145]: 2026-02-20 08:53:45.155383824 +0000 UTC m=+0.085526925 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ceilometer_agent_compute, release=1766032510, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:53:45 localhost podman[105143]: 2026-02-20 08:53:45.203508452 +0000 UTC m=+0.138640287 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, release=1766032510, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red 
Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:53:45 localhost podman[105145]: 2026-02-20 08:53:45.210140607 +0000 UTC m=+0.140283688 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, 
io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=ceilometer_agent_compute, release=1766032510, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team) Feb 20 03:53:45 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:53:45 localhost podman[105143]: 2026-02-20 08:53:45.244084437 +0000 UTC m=+0.179216252 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, tcib_managed=true, version=17.1.13, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1) Feb 20 03:53:45 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:53:45 localhost podman[105146]: 2026-02-20 08:53:45.265463708 +0000 UTC m=+0.192522044 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 
qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container) Feb 20 03:53:45 localhost podman[105142]: 2026-02-20 08:53:45.310102978 +0000 UTC m=+0.245065118 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64) Feb 20 03:53:45 localhost podman[105142]: 2026-02-20 08:53:45.343117338 +0000 UTC m=+0.278079468 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, 
batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, 
url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z) Feb 20 03:53:45 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:53:45 localhost podman[105144]: 2026-02-20 08:53:45.359508845 +0000 UTC m=+0.292905386 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:53:45 localhost podman[105144]: 2026-02-20 08:53:45.370623909 +0000 UTC m=+0.304020430 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Feb 20 
03:53:45 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:53:45 localhost podman[105146]: 2026-02-20 08:53:45.496923564 +0000 UTC m=+0.423981880 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, tcib_managed=true, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public) Feb 20 03:53:45 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:53:52 localhost systemd[1]: tmp-crun.7bV8AD.mount: Deactivated successfully. 
Feb 20 03:53:52 localhost podman[105265]: 2026-02-20 08:53:52.135566234 +0000 UTC m=+0.066452006 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 20 03:53:52 localhost podman[105265]: 2026-02-20 08:53:52.16875176 +0000 UTC m=+0.099637562 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:53:52 localhost systemd[1]: tmp-crun.vQCrBl.mount: Deactivated successfully. Feb 20 03:53:52 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:53:52 localhost podman[105266]: 2026-02-20 08:53:52.188505131 +0000 UTC m=+0.111723616 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:53:52 localhost podman[105264]: 2026-02-20 08:53:52.216376483 +0000 UTC m=+0.146967576 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, version=17.1.13, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:53:52 localhost podman[105267]: 2026-02-20 08:53:52.233545063 +0000 UTC m=+0.153832397 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 
'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 
nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step5, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.) Feb 20 03:53:52 localhost podman[105264]: 2026-02-20 08:53:52.254562413 +0000 UTC m=+0.185153506 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=) Feb 20 03:53:52 localhost podman[105264]: unhealthy Feb 20 03:53:52 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:53:52 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 03:53:52 localhost podman[105267]: 2026-02-20 08:53:52.287931794 +0000 UTC m=+0.208219118 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:53:52 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. 
Feb 20 03:53:52 localhost podman[105266]: 2026-02-20 08:53:52.308263784 +0000 UTC m=+0.231482309 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, container_name=ovn_metadata_agent, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:53:52 localhost podman[105266]: unhealthy Feb 20 03:53:52 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:53:52 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:54:00 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:54:00 localhost recover_tripleo_nova_virtqemud[105348]: 63005 Feb 20 03:54:00 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:54:00 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:54:00 localhost systemd[1]: tmp-crun.kGfWlc.mount: Deactivated successfully. Feb 20 03:54:00 localhost podman[105346]: 2026-02-20 08:54:00.150193658 +0000 UTC m=+0.085255718 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container) Feb 20 03:54:00 localhost sshd[105368]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:54:00 localhost podman[105346]: 2026-02-20 08:54:00.487929539 +0000 UTC m=+0.422991609 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:54:00 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:54:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:54:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:54:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:54:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:54:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:54:16 localhost podman[105372]: 2026-02-20 08:54:16.157001602 +0000 UTC m=+0.093727979 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com) Feb 20 03:54:16 localhost systemd[1]: tmp-crun.niMvDW.mount: Deactivated successfully. 
Feb 20 03:54:16 localhost podman[105385]: 2026-02-20 08:54:16.192144738 +0000 UTC m=+0.110702293 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 20 03:54:16 localhost podman[105372]: 2026-02-20 08:54:16.198399672 +0000 UTC m=+0.135126119 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:54:16 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:54:16 localhost podman[105382]: 2026-02-20 08:54:16.222016822 +0000 UTC m=+0.148145401 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:54:16 localhost podman[105382]: 2026-02-20 08:54:16.250236415 +0000 UTC m=+0.176364944 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, 
name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=) Feb 20 03:54:16 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:54:16 localhost podman[105373]: 2026-02-20 08:54:16.286800145 +0000 UTC m=+0.217052401 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, container_name=collectd, config_id=tripleo_step3, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:54:16 localhost podman[105371]: 2026-02-20 08:54:16.317794213 +0000 UTC m=+0.252822127 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:54:16 localhost podman[105371]: 2026-02-20 08:54:16.354068685 +0000 UTC m=+0.289096649 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, 
url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:54:16 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:54:16 localhost podman[105373]: 2026-02-20 08:54:16.397070454 +0000 UTC m=+0.327322660 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:54:16 localhost podman[105385]: 2026-02-20 08:54:16.399885861 +0000 UTC m=+0.318443446 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd) Feb 20 03:54:16 localhost systemd[1]: 
f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:54:16 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:54:23 localhost podman[105492]: 2026-02-20 08:54:23.151233567 +0000 UTC m=+0.082326886 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git) Feb 20 03:54:23 localhost systemd[1]: tmp-crun.Q5rnbw.mount: Deactivated successfully. 
Feb 20 03:54:23 localhost podman[105494]: 2026-02-20 08:54:23.211584723 +0000 UTC m=+0.138148222 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1) Feb 20 03:54:23 localhost podman[105492]: 2026-02-20 08:54:23.218456875 +0000 UTC m=+0.149550244 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 20 03:54:23 localhost podman[105492]: unhealthy Feb 20 03:54:23 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:54:23 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 03:54:23 localhost podman[105494]: 2026-02-20 08:54:23.230383405 +0000 UTC m=+0.156946904 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 20 03:54:23 localhost podman[105494]: unhealthy Feb 20 03:54:23 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:54:23 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:54:23 localhost podman[105493]: 2026-02-20 08:54:23.315462035 +0000 UTC m=+0.245186801 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc.) Feb 20 03:54:23 localhost podman[105495]: 2026-02-20 08:54:23.365339267 +0000 UTC m=+0.287205021 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_id=tripleo_step5, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
container_name=nova_compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 
03:54:23 localhost podman[105493]: 2026-02-20 08:54:23.378260197 +0000 UTC m=+0.307984953 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, architecture=x86_64, release=1766032510, distribution-scope=public, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com) Feb 20 03:54:23 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:54:23 localhost podman[105495]: 2026-02-20 08:54:23.398053489 +0000 UTC m=+0.319919223 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, config_id=tripleo_step5, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:54:23 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:54:31 localhost podman[105579]: 2026-02-20 08:54:31.154375436 +0000 UTC m=+0.089111006 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:54:31 localhost podman[105579]: 2026-02-20 08:54:31.524610472 +0000 UTC m=+0.459346082 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:54:31 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:54:43 localhost sshd[105680]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:54:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:54:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:54:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:54:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:54:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:54:47 localhost podman[105683]: 2026-02-20 08:54:47.170660993 +0000 UTC m=+0.106957269 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:54:47 localhost podman[105683]: 2026-02-20 08:54:47.213872858 +0000 UTC m=+0.150169174 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, vcs-type=git, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) 
Feb 20 03:54:47 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:54:47 localhost podman[105686]: 2026-02-20 08:54:47.216352075 +0000 UTC m=+0.143244330 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:54:47 localhost podman[105682]: 2026-02-20 08:54:47.267108524 +0000 UTC m=+0.202880323 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1766032510, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com) Feb 20 03:54:47 localhost podman[105684]: 2026-02-20 08:54:47.31451054 +0000 UTC m=+0.250148605 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:54:47 localhost podman[105684]: 2026-02-20 08:54:47.325854361 +0000 UTC m=+0.261492436 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, version=17.1.13, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:54:47 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:54:47 localhost podman[105685]: 2026-02-20 08:54:47.373200145 +0000 UTC m=+0.303057601 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:54:47 localhost podman[105682]: 2026-02-20 08:54:47.401282833 +0000 UTC m=+0.337054572 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:54:47 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:54:47 localhost podman[105686]: 2026-02-20 08:54:47.434013525 +0000 UTC m=+0.360905700 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1766032510, batch=17.1_20260112.1, container_name=metrics_qdr, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Feb 20 03:54:47 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:54:47 localhost podman[105685]: 2026-02-20 08:54:47.453841708 +0000 UTC m=+0.383699204 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 20 03:54:47 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. Feb 20 03:54:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:54:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:54:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:54:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:54:54 localhost systemd[1]: tmp-crun.XGy0W9.mount: Deactivated successfully. Feb 20 03:54:54 localhost podman[105801]: 2026-02-20 08:54:54.168014352 +0000 UTC m=+0.097561806 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc.) Feb 20 03:54:54 localhost podman[105802]: 2026-02-20 08:54:54.199749194 +0000 UTC m=+0.127374189 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64) Feb 20 03:54:54 localhost podman[105802]: 2026-02-20 08:54:54.211985642 +0000 UTC m=+0.139610617 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, vcs-type=git, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true) 
Feb 20 03:54:54 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:54:54 localhost podman[105801]: 2026-02-20 08:54:54.249375628 +0000 UTC m=+0.178923042 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 20 03:54:54 localhost podman[105801]: unhealthy Feb 20 03:54:54 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:54:54 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:54:54 localhost podman[105804]: 2026-02-20 08:54:54.301790079 +0000 UTC m=+0.226376891 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:54:54 localhost podman[105803]: 2026-02-20 08:54:54.255678874 +0000 UTC m=+0.180329187 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:54:54 localhost podman[105804]: 2026-02-20 08:54:54.329080673 +0000 UTC m=+0.253667495 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step5, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:54:54 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully. Feb 20 03:54:54 localhost podman[105803]: 2026-02-20 08:54:54.386333712 +0000 UTC m=+0.310984085 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 20 03:54:54 localhost podman[105803]: unhealthy Feb 20 03:54:54 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 
20 03:54:54 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:54:56 localhost sshd[105884]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:55:02 localhost systemd[1]: tmp-crun.j9gksB.mount: Deactivated successfully. Feb 20 03:55:02 localhost podman[105886]: 2026-02-20 08:55:02.137140209 +0000 UTC m=+0.076238548 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-type=git, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:55:02 localhost podman[105886]: 2026-02-20 08:55:02.482786996 +0000 UTC m=+0.421885345 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, release=1766032510, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5)
Feb 20 03:55:02 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 03:55:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 03:55:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 03:55:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 03:55:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 03:55:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 03:55:18 localhost podman[105911]: 2026-02-20 08:55:18.146726759 +0000 UTC m=+0.084855975 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510)
Feb 20 03:55:18 localhost podman[105910]: 2026-02-20 08:55:18.199972995 +0000 UTC m=+0.136264444 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, com.redhat.component=openstack-cron-container, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 03:55:18 localhost podman[105910]: 2026-02-20 08:55:18.206842127 +0000 UTC m=+0.143133606 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 03:55:18 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 03:55:18 localhost podman[105909]: 2026-02-20 08:55:18.262958382 +0000 UTC m=+0.200528120 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 03:55:18 localhost podman[105912]: 2026-02-20 08:55:18.319248113 +0000 UTC m=+0.249564628 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 20 03:55:18 localhost podman[105913]: 2026-02-20 08:55:18.371041274 +0000 UTC m=+0.296196539 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.)
Feb 20 03:55:18 localhost podman[105912]: 2026-02-20 08:55:18.380214347 +0000 UTC m=+0.310530822 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 20 03:55:18 localhost podman[105909]: 2026-02-20 08:55:18.393578761 +0000 UTC m=+0.331148489 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64)
Feb 20 03:55:18 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 03:55:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 03:55:18 localhost podman[105911]: 2026-02-20 08:55:18.433196245 +0000 UTC m=+0.371325521 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 20 03:55:18 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 03:55:18 localhost podman[105913]: 2026-02-20 08:55:18.569739747 +0000 UTC m=+0.494894982 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, vcs-type=git, config_id=tripleo_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 03:55:18 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 03:55:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 03:55:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 03:55:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 03:55:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 03:55:25 localhost podman[106032]: 2026-02-20 08:55:25.16029535 +0000 UTC m=+0.085226216 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 03:55:25 localhost podman[106032]: 2026-02-20 08:55:25.174400777 +0000 UTC m=+0.099331593 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 03:55:25 localhost podman[106032]: unhealthy
Feb 20 03:55:25 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 03:55:25 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 03:55:25 localhost sshd[106080]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:55:25 localhost podman[106033]: 2026-02-20 08:55:25.259076334 +0000 UTC m=+0.183150484 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 20 03:55:25 localhost podman[106031]: 2026-02-20 08:55:25.228230501 +0000 UTC m=+0.157471680 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, version=17.1.13, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:55:25 localhost podman[106033]: 2026-02-20 08:55:25.307258104 +0000 UTC m=+0.231332264 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, config_id=tripleo_step5, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:55:25 localhost podman[106033]: unhealthy Feb 20 03:55:25 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:55:25 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. 
Feb 20 03:55:25 localhost podman[106030]: 2026-02-20 08:55:25.283024914 +0000 UTC m=+0.215818983 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5) Feb 20 03:55:25 localhost podman[106030]: 2026-02-20 08:55:25.362245494 +0000 UTC m=+0.295039563 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team) Feb 20 03:55:25 localhost podman[106030]: unhealthy Feb 20 03:55:25 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:55:25 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:55:25 localhost podman[106031]: 2026-02-20 08:55:25.415399058 +0000 UTC m=+0.344640227 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, tcib_managed=true) Feb 20 03:55:25 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:55:33 localhost podman[106113]: 2026-02-20 08:55:33.146729751 +0000 UTC m=+0.082830382 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:55:33 localhost podman[106113]: 2026-02-20 08:55:33.512068847 +0000 UTC m=+0.448169488 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 20 03:55:33 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:55:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:55:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:55:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:55:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:55:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:55:49 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:55:49 localhost recover_tripleo_nova_virtqemud[106250]: 63005 Feb 20 03:55:49 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:55:49 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:55:49 localhost podman[106215]: 2026-02-20 08:55:49.154945298 +0000 UTC m=+0.089395875 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible) Feb 20 03:55:49 localhost systemd[1]: tmp-crun.VcPrvl.mount: Deactivated successfully. 
Feb 20 03:55:49 localhost podman[106217]: 2026-02-20 08:55:49.174072999 +0000 UTC m=+0.101708326 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step3) Feb 20 03:55:49 localhost podman[106215]: 2026-02-20 08:55:49.2090573 +0000 UTC m=+0.143507877 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team) Feb 20 03:55:49 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:55:49 localhost podman[106229]: 2026-02-20 08:55:49.228726379 +0000 UTC m=+0.149608217 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=metrics_qdr, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, 
release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64) Feb 20 03:55:49 localhost podman[106217]: 2026-02-20 08:55:49.236332934 +0000 UTC m=+0.163968291 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, container_name=collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:55:49 localhost podman[106223]: 2026-02-20 08:55:49.273210814 +0000 UTC m=+0.194130013 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible) Feb 20 03:55:49 localhost podman[106216]: 2026-02-20 08:55:49.316155521 +0000 UTC m=+0.246974796 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container) Feb 20 03:55:49 localhost podman[106216]: 2026-02-20 08:55:49.323315123 +0000 UTC m=+0.254134408 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:55:49 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. 
Feb 20 03:55:49 localhost podman[106223]: 2026-02-20 08:55:49.349980617 +0000 UTC m=+0.270899826 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:55:49 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:55:49 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:55:49 localhost podman[106229]: 2026-02-20 08:55:49.444198281 +0000 UTC m=+0.365080119 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container) Feb 20 03:55:49 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:55:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:55:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:55:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:55:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:55:56 localhost podman[106341]: 2026-02-20 08:55:56.169744508 +0000 UTC m=+0.088697184 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, version=17.1.13, container_name=nova_compute, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:55:56 localhost podman[106341]: 2026-02-20 08:55:56.198120795 +0000 UTC m=+0.117073241 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc.) Feb 20 03:55:56 localhost podman[106341]: unhealthy Feb 20 03:55:56 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:55:56 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. 
Feb 20 03:55:56 localhost podman[106339]: 2026-02-20 08:55:56.214965266 +0000 UTC m=+0.140141964 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 20 03:55:56 localhost podman[106338]: 2026-02-20 08:55:56.254253461 +0000 UTC m=+0.182410271 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=ovn_controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com) Feb 20 03:55:56 localhost podman[106340]: 2026-02-20 08:55:56.273285509 +0000 UTC m=+0.195556547 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
build-date=2026-01-12T22:56:19Z) Feb 20 03:55:56 localhost podman[106339]: 2026-02-20 08:55:56.27945215 +0000 UTC m=+0.204628838 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:55:56 localhost podman[106340]: 2026-02-20 08:55:56.289114468 +0000 UTC m=+0.211385506 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:55:56 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. 
Feb 20 03:55:56 localhost podman[106340]: unhealthy Feb 20 03:55:56 localhost podman[106338]: 2026-02-20 08:55:56.300201381 +0000 UTC m=+0.228358191 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:55:56 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:55:56 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:55:56 localhost podman[106338]: unhealthy Feb 20 03:55:56 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:55:56 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:55:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45345 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D019F0000000001030307) Feb 20 03:55:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45346 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D05A90000000001030307) Feb 20 03:55:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45347 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D0DA80000000001030307) Feb 20 03:56:00 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20220 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D11840000000001030307) Feb 20 03:56:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13616 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D11E00000000001030307) Feb 20 03:56:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20221 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D15A80000000001030307) Feb 20 03:56:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13617 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D15E80000000001030307) Feb 20 03:56:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45348 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D1D680000000001030307) Feb 20 03:56:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20222 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D1DA80000000001030307) Feb 20 03:56:03 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13618 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D1DE90000000001030307) Feb 20 03:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:56:04 localhost systemd[1]: tmp-crun.khjE9P.mount: Deactivated successfully. Feb 20 03:56:04 localhost podman[106417]: 2026-02-20 08:56:04.161365981 +0000 UTC m=+0.098077473 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2026-01-12T23:32:04Z, version=17.1.13) Feb 20 03:56:04 localhost podman[106417]: 2026-02-20 08:56:04.561346256 +0000 UTC m=+0.498057708 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=nova_migration_target, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:56:04 localhost systemd[1]: 
b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:56:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12485 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D27A40000000001030307) Feb 20 03:56:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12486 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D2BA80000000001030307) Feb 20 03:56:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20223 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D2D680000000001030307) Feb 20 03:56:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13619 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D2DA80000000001030307) Feb 20 03:56:07 localhost sshd[106440]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:56:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12487 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D33A80000000001030307) Feb 20 03:56:09 localhost sshd[106442]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:56:11 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45349 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D3D680000000001030307) Feb 20 03:56:13 localhost sshd[106444]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:56:13 localhost systemd-logind[759]: New session 37 of user zuul. Feb 20 03:56:13 localhost systemd[1]: Started Session 37 of User zuul. Feb 20 03:56:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12488 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D43680000000001030307) Feb 20 03:56:13 localhost sshd[106480]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:56:14 localhost python3.9[106541]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 03:56:14 localhost python3.9[106635]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 03:56:15 localhost python3.9[106728]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 03:56:15 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20224 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D4D680000000001030307) Feb 20 03:56:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13620 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D4D690000000001030307) Feb 20 03:56:16 localhost python3.9[106822]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 03:56:16 localhost python3.9[106915]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 03:56:17 localhost python3.9[107006]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Feb 20 03:56:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58880 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D542A0000000001030307) Feb 20 03:56:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=58881 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D58280000000001030307) Feb 20 03:56:19 localhost python3.9[107096]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 03:56:19 localhost python3.9[107188]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Feb 20 03:56:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:56:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:56:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:56:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:56:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:56:20 localhost systemd[1]: tmp-crun.b04IBC.mount: Deactivated successfully. 
Feb 20 03:56:20 localhost podman[107205]: 2026-02-20 08:56:20.17130425 +0000 UTC m=+0.094560325 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, release=1766032510, io.buildah.version=1.41.5, 
com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1) Feb 20 03:56:20 localhost systemd[1]: tmp-crun.I0wQVC.mount: Deactivated successfully. Feb 20 03:56:20 localhost podman[107205]: 2026-02-20 08:56:20.208805349 +0000 UTC m=+0.132061444 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., container_name=collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git) Feb 20 03:56:20 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:56:20 localhost podman[107204]: 2026-02-20 08:56:20.259795106 +0000 UTC m=+0.188392836 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible) Feb 20 03:56:20 localhost podman[107203]: 2026-02-20 08:56:20.2207896 +0000 UTC m=+0.148284156 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 20 03:56:20 localhost podman[107203]: 2026-02-20 08:56:20.30523792 +0000 UTC m=+0.232732516 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, vcs-type=git, config_id=tripleo_step4) Feb 20 03:56:20 localhost systemd[1]: 
1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:56:20 localhost podman[107213]: 2026-02-20 08:56:20.325605441 +0000 UTC m=+0.243397107 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step1, vendor=Red Hat, Inc.) Feb 20 03:56:20 localhost podman[107206]: 2026-02-20 08:56:20.377149234 +0000 UTC m=+0.295619861 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 20 03:56:20 localhost podman[107204]: 2026-02-20 08:56:20.397345678 +0000 UTC m=+0.325943398 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, 
container_name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:56:20 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:56:20 localhost podman[107206]: 2026-02-20 08:56:20.439994967 +0000 UTC m=+0.358465594 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, release=1766032510, url=https://www.redhat.com, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 20 03:56:20 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:56:20 localhost podman[107213]: 2026-02-20 08:56:20.515961106 +0000 UTC m=+0.433752772 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc.) Feb 20 03:56:20 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:56:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58882 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D60290000000001030307) Feb 20 03:56:20 localhost python3.9[107396]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 20 03:56:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12489 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D63680000000001030307) Feb 20 03:56:21 localhost python3.9[107444]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False 
enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 03:56:22 localhost systemd[1]: session-37.scope: Deactivated successfully. Feb 20 03:56:22 localhost systemd[1]: session-37.scope: Consumed 4.904s CPU time. Feb 20 03:56:22 localhost systemd-logind[759]: Session 37 logged out. Waiting for processes to exit. Feb 20 03:56:22 localhost systemd-logind[759]: Removed session 37. Feb 20 03:56:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58883 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D6FE90000000001030307) Feb 20 03:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. 
Feb 20 03:56:27 localhost podman[107462]: 2026-02-20 08:56:27.172894594 +0000 UTC m=+0.090177639 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:56:27 localhost podman[107461]: 2026-02-20 08:56:27.223069435 +0000 UTC m=+0.142691792 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, architecture=x86_64, 
com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 20 03:56:27 localhost systemd[1]: tmp-crun.XR3Mn6.mount: Deactivated successfully. 
Feb 20 03:56:27 localhost podman[107461]: 2026-02-20 08:56:27.264045662 +0000 UTC m=+0.183668049 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:56:27 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:56:27 localhost podman[107460]: 2026-02-20 08:56:27.278938133 +0000 UTC m=+0.203105271 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, 
container_name=ovn_controller, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:36:40Z, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:56:27 localhost podman[107462]: 2026-02-20 08:56:27.295541876 +0000 UTC m=+0.212824931 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 03:56:27 localhost podman[107462]: unhealthy Feb 20 03:56:27 localhost podman[107468]: 2026-02-20 08:56:27.254507857 +0000 UTC m=+0.166118136 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=nova_compute, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:56:27 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:56:27 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:56:27 localhost podman[107468]: 2026-02-20 08:56:27.337981518 +0000 UTC m=+0.249591807 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:56:27 localhost podman[107468]: unhealthy Feb 20 03:56:27 localhost podman[107460]: 2026-02-20 08:56:27.347998567 +0000 UTC m=+0.272165715 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:56:27 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:56:27 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. 
Feb 20 03:56:27 localhost podman[107460]: unhealthy Feb 20 03:56:27 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:56:27 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:56:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55666 DF PROTO=TCP SPT=57422 DPT=9105 SEQ=2464786171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D7AE80000000001030307) Feb 20 03:56:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55667 DF PROTO=TCP SPT=57422 DPT=9105 SEQ=2464786171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D82E80000000001030307) Feb 20 03:56:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58884 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D8F680000000001030307) Feb 20 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:56:35 localhost systemd[1]: tmp-crun.DIRMOS.mount: Deactivated successfully. 
Feb 20 03:56:35 localhost podman[107540]: 2026-02-20 08:56:35.14777912 +0000 UTC m=+0.088581040 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13) Feb 20 03:56:35 localhost podman[107540]: 2026-02-20 08:56:35.529554553 +0000 UTC m=+0.470356463 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510) Feb 20 03:56:35 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:56:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18700 DF PROTO=TCP SPT=57424 DPT=9101 SEQ=775098770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D9CD50000000001030307) Feb 20 03:56:39 localhost sshd[107563]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:56:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18702 DF PROTO=TCP SPT=57424 DPT=9101 SEQ=775098770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DA8E80000000001030307) Feb 20 03:56:39 localhost systemd-logind[759]: New session 38 of user zuul. Feb 20 03:56:39 localhost systemd[1]: Started Session 38 of User zuul. Feb 20 03:56:40 localhost python3.9[107658]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 03:56:40 localhost systemd[1]: Reloading. Feb 20 03:56:40 localhost systemd-rc-local-generator[107680]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:56:40 localhost systemd-sysv-generator[107685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:56:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:56:40 localhost systemd[1]: Starting dnf makecache... Feb 20 03:56:41 localhost dnf[107694]: Updating Subscription Management repositories. 
Feb 20 03:56:41 localhost python3.9[107784]: ansible-ansible.builtin.service_facts Invoked Feb 20 03:56:41 localhost network[107801]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 03:56:41 localhost network[107802]: 'network-scripts' will be removed from distribution in near future. Feb 20 03:56:41 localhost network[107803]: It is advised to switch to 'NetworkManager' instead for network management. Feb 20 03:56:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55669 DF PROTO=TCP SPT=57422 DPT=9105 SEQ=2464786171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DB3690000000001030307) Feb 20 03:56:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:56:43 localhost dnf[107694]: Metadata cache refreshed recently. Feb 20 03:56:43 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Feb 20 03:56:43 localhost systemd[1]: Finished dnf makecache. Feb 20 03:56:43 localhost systemd[1]: dnf-makecache.service: Consumed 2.180s CPU time. 
Feb 20 03:56:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64317 DF PROTO=TCP SPT=38472 DPT=9100 SEQ=192635383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DC3690000000001030307) Feb 20 03:56:46 localhost sshd[108002]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:56:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46349 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=1296352457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DC95A0000000001030307) Feb 20 03:56:48 localhost python3.9[108079]: ansible-ansible.builtin.service_facts Invoked Feb 20 03:56:48 localhost network[108096]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 03:56:48 localhost network[108097]: 'network-scripts' will be removed from distribution in near future. Feb 20 03:56:48 localhost network[108098]: It is advised to switch to 'NetworkManager' instead for network management. Feb 20 03:56:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:56:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:56:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:56:50 localhost podman[108142]: 2026-02-20 08:56:50.372105128 +0000 UTC m=+0.084984258 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, architecture=x86_64, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 20 03:56:50 localhost podman[108142]: 2026-02-20 08:56:50.385030888 +0000 UTC m=+0.097910048 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public) Feb 20 03:56:50 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. 
Feb 20 03:56:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:56:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:56:50 localhost podman[108161]: 2026-02-20 08:56:50.456690623 +0000 UTC m=+0.093059558 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:56:50 localhost podman[108161]: 2026-02-20 08:56:50.50380646 +0000 UTC m=+0.140175385 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, version=17.1.13) Feb 20 03:56:50 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. 
Feb 20 03:56:50 localhost podman[108183]: 2026-02-20 08:56:50.519016491 +0000 UTC m=+0.079118088 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:56:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:56:50 localhost systemd[1]: tmp-crun.lTkmTm.mount: Deactivated successfully. Feb 20 03:56:50 localhost podman[108195]: 2026-02-20 08:56:50.589969404 +0000 UTC m=+0.127080409 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z) Feb 20 03:56:50 localhost podman[108183]: 2026-02-20 08:56:50.603140232 +0000 UTC m=+0.163241829 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:10:15Z, tcib_managed=true, architecture=x86_64) Feb 20 03:56:50 localhost podman[108195]: 2026-02-20 08:56:50.618795015 +0000 UTC m=+0.155905960 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 20 03:56:50 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:56:50 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully. 
Feb 20 03:56:50 localhost podman[108233]: 2026-02-20 08:56:50.740806718 +0000 UTC m=+0.176882520 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible) Feb 20 03:56:50 localhost podman[108233]: 2026-02-20 08:56:50.921671189 +0000 UTC m=+0.357746951 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1) Feb 20 03:56:50 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. 
Feb 20 03:56:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46351 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=1296352457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DD5680000000001030307) Feb 20 03:56:52 localhost python3.9[108419]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:56:52 localhost systemd[1]: Reloading. Feb 20 03:56:52 localhost systemd-sysv-generator[108446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:56:52 localhost systemd-rc-local-generator[108443]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:56:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:56:52 localhost systemd[1]: Stopping ceilometer_agent_compute container... Feb 20 03:56:53 localhost sshd[108472]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:56:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46352 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=1296352457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DE5290000000001030307) Feb 20 03:56:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. 
Feb 20 03:56:57 localhost podman[108474]: 2026-02-20 08:56:57.389744897 +0000 UTC m=+0.082596804 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, vcs-type=git) Feb 20 03:56:57 localhost podman[108474]: 2026-02-20 08:56:57.402987307 +0000 UTC m=+0.095839264 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid) Feb 20 03:56:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:56:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:56:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:56:57 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:56:57 localhost systemd[1]: tmp-crun.2jZBwG.mount: Deactivated successfully. 
Feb 20 03:56:57 localhost podman[108495]: 2026-02-20 08:56:57.505474616 +0000 UTC m=+0.082749200 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 03:56:57 localhost podman[108493]: 2026-02-20 08:56:57.553333906 +0000 UTC m=+0.138535105 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container) Feb 20 03:56:57 localhost podman[108495]: 2026-02-20 08:56:57.556594466 +0000 UTC m=+0.133869060 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:56:57 localhost podman[108495]: unhealthy Feb 20 03:56:57 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:56:57 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. 
Feb 20 03:56:57 localhost podman[108493]: 2026-02-20 08:56:57.570131385 +0000 UTC m=+0.155332584 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:56:57 localhost podman[108493]: unhealthy Feb 20 03:56:57 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:56:57 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:56:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2092 DF PROTO=TCP SPT=54016 DPT=9105 SEQ=1705066968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DF0280000000001030307) Feb 20 03:56:57 localhost podman[108494]: 2026-02-20 08:56:57.652729188 +0000 UTC m=+0.233477249 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:56:57 localhost podman[108494]: 2026-02-20 08:56:57.670917731 +0000 UTC m=+0.251665812 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z) Feb 20 03:56:57 localhost podman[108494]: unhealthy Feb 20 03:56:57 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:56:57 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:56:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2093 DF PROTO=TCP SPT=54016 DPT=9105 SEQ=1705066968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DF8280000000001030307) Feb 20 03:57:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46353 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=1296352457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E05680000000001030307) Feb 20 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:57:06 localhost podman[108554]: 2026-02-20 08:57:06.128481328 +0000 UTC m=+0.064940299 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Feb 20 03:57:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62194 DF PROTO=TCP SPT=60452 DPT=9101 SEQ=2944394580 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E12050000000001030307) Feb 20 03:57:06 localhost podman[108554]: 2026-02-20 08:57:06.536352718 +0000 UTC m=+0.472811769 container exec_died 
b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, 
release=1766032510, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public) Feb 20 03:57:06 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. Feb 20 03:57:09 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:57:09 localhost recover_tripleo_nova_virtqemud[108579]: 63005 Feb 20 03:57:09 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:57:09 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 20 03:57:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62196 DF PROTO=TCP SPT=60452 DPT=9101 SEQ=2944394580 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E1E280000000001030307) Feb 20 03:57:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2095 DF PROTO=TCP SPT=54016 DPT=9105 SEQ=1705066968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E27680000000001030307) Feb 20 03:57:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62481 DF PROTO=TCP SPT=44394 DPT=9102 SEQ=242983024 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E37680000000001030307) Feb 20 03:57:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47325 DF PROTO=TCP SPT=57518 DPT=9882 SEQ=346104544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E3E8A0000000001030307) Feb 20 03:57:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:57:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:57:20 localhost podman[108581]: 2026-02-20 08:57:20.628286738 +0000 UTC m=+0.065095815 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, release=1766032510, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:57:20 localhost podman[108581]: 2026-02-20 08:57:20.638521074 +0000 UTC m=+0.075330151 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 
17.1 collectd, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Feb 20 03:57:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:57:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:57:20 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:57:20 localhost podman[108610]: 2026-02-20 08:57:20.750731633 +0000 UTC m=+0.087426884 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron) Feb 20 03:57:20 localhost podman[108580]: 2026-02-20 08:57:20.71214027 +0000 UTC m=+0.146408028 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, 
batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64) Feb 20 03:57:20 localhost podman[108610]: 2026-02-20 08:57:20.78717582 +0000 UTC m=+0.123871031 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com) Feb 20 03:57:20 localhost podman[108580]: 2026-02-20 08:57:20.796084465 +0000 UTC m=+0.230352253 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:57:20 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:57:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47327 DF PROTO=TCP SPT=57518 DPT=9882 SEQ=346104544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E4AA80000000001030307) Feb 20 03:57:20 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully. Feb 20 03:57:20 localhost podman[108611]: Error: container cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a is not running Feb 20 03:57:20 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Main process exited, code=exited, status=125/n/a Feb 20 03:57:20 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed with result 'exit-code'. 
Feb 20 03:57:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:57:21 localhost podman[108656]: 2026-02-20 08:57:21.158826381 +0000 UTC m=+0.090163508 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510) Feb 20 03:57:21 localhost podman[108656]: 2026-02-20 08:57:21.358987869 +0000 UTC m=+0.290324986 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:57:21 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:57:21 localhost systemd[1]: tmp-crun.ICmI5p.mount: Deactivated successfully. 
Feb 20 03:57:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47328 DF PROTO=TCP SPT=57518 DPT=9882 SEQ=346104544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E5A690000000001030307) Feb 20 03:57:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22162 DF PROTO=TCP SPT=57974 DPT=9105 SEQ=2104025892 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E65280000000001030307) Feb 20 03:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:57:27 localhost systemd[1]: tmp-crun.qGHC2T.mount: Deactivated successfully. Feb 20 03:57:27 localhost systemd[1]: tmp-crun.N6PzfT.mount: Deactivated successfully. 
Feb 20 03:57:27 localhost podman[108693]: 2026-02-20 08:57:27.913605305 +0000 UTC m=+0.085730431 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13) Feb 20 03:57:27 localhost podman[108684]: 2026-02-20 08:57:27.894244036 +0000 UTC m=+0.080822410 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, url=https://www.redhat.com) Feb 20 03:57:27 localhost podman[108686]: 2026-02-20 08:57:27.951053673 +0000 UTC m=+0.128338709 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:57:27 localhost podman[108693]: 2026-02-20 08:57:27.980158233 +0000 UTC m=+0.152283389 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 20 03:57:27 localhost podman[108693]: unhealthy Feb 20 03:57:27 localhost podman[108686]: 2026-02-20 08:57:27.991674779 +0000 UTC m=+0.168959795 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510) Feb 20 03:57:27 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:57:27 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. Feb 20 03:57:28 localhost podman[108686]: unhealthy Feb 20 03:57:28 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:57:28 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:57:28 localhost podman[108684]: 2026-02-20 08:57:28.024503914 +0000 UTC m=+0.211082228 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public) Feb 20 03:57:28 localhost podman[108684]: unhealthy Feb 20 03:57:28 localhost podman[108685]: 2026-02-20 08:57:27.996654213 +0000 UTC m=+0.176749616 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, container_name=iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:57:28 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:57:28 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 03:57:28 localhost podman[108685]: 2026-02-20 08:57:28.07550346 +0000 UTC m=+0.255598863 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z) Feb 20 03:57:28 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:57:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22163 DF PROTO=TCP SPT=57974 DPT=9105 SEQ=2104025892 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E6D280000000001030307) Feb 20 03:57:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47391 DF PROTO=TCP SPT=60244 DPT=9100 SEQ=760201816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E79680000000001030307) Feb 20 03:57:34 localhost podman[108459]: time="2026-02-20T08:57:34Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL" Feb 20 03:57:34 localhost systemd[1]: libpod-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.scope: Deactivated successfully. Feb 20 03:57:34 localhost systemd[1]: libpod-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.scope: Consumed 6.248s CPU time. 
Feb 20 03:57:34 localhost podman[108459]: 2026-02-20 08:57:34.859176644 +0000 UTC m=+42.097371613 container died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container) Feb 20 03:57:34 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: Deactivated successfully. Feb 20 03:57:34 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a. Feb 20 03:57:34 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: No such file or directory Feb 20 03:57:34 localhost systemd[1]: tmp-crun.9R1hAS.mount: Deactivated successfully. 
Feb 20 03:57:34 localhost podman[108459]: 2026-02-20 08:57:34.924087541 +0000 UTC m=+42.162282480 container cleanup cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true) Feb 20 03:57:34 localhost podman[108459]: ceilometer_agent_compute Feb 20 03:57:34 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: No such file or directory Feb 20 03:57:34 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: No such file or directory Feb 20 03:57:34 localhost podman[108762]: 2026-02-20 08:57:34.952095157 +0000 UTC m=+0.080891982 container cleanup cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, 
name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 03:57:34 localhost systemd[1]: libpod-conmon-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.scope: Deactivated successfully. Feb 20 03:57:35 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: No such file or directory Feb 20 03:57:35 localhost systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: No such file or directory Feb 20 03:57:35 localhost podman[108778]: 2026-02-20 08:57:35.063657746 +0000 UTC m=+0.069386916 container cleanup cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true) Feb 20 03:57:35 localhost podman[108778]: ceilometer_agent_compute Feb 20 03:57:35 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully. Feb 20 03:57:35 localhost systemd[1]: Stopped ceilometer_agent_compute container. Feb 20 03:57:35 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.091s CPU time, no IO. 
Feb 20 03:57:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:d7:b4:4a MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=45034 SEQ=671084412 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 20 03:57:35 localhost python3.9[108881]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:57:35 localhost systemd[1]: var-lib-containers-storage-overlay-e1c07b1bd08758bd14fb80cc901f6da6a3ccc5e5eba94f04ead08e95db5f3037-merged.mount: Deactivated successfully. Feb 20 03:57:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a-userdata-shm.mount: Deactivated successfully. Feb 20 03:57:35 localhost systemd[1]: Reloading. Feb 20 03:57:35 localhost systemd-rc-local-generator[108906]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:57:35 localhost systemd-sysv-generator[108912]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:57:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:57:36 localhost sshd[108920]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:57:36 localhost systemd[1]: Stopping ceilometer_agent_ipmi container... Feb 20 03:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. 
Feb 20 03:57:37 localhost podman[108938]: 2026-02-20 08:57:37.144949504 +0000 UTC m=+0.083855573 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510) Feb 20 03:57:37 localhost podman[108938]: 2026-02-20 08:57:37.517398509 +0000 UTC m=+0.456304518 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public) Feb 20 03:57:37 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:57:38 localhost sshd[108961]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:57:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59499 DF PROTO=TCP SPT=51882 DPT=9101 SEQ=2614739063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E93290000000001030307) Feb 20 03:57:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22165 DF PROTO=TCP SPT=57974 DPT=9105 SEQ=2104025892 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E9D680000000001030307) Feb 20 03:57:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20144 DF PROTO=TCP SPT=55476 DPT=9100 SEQ=250879534 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EAD680000000001030307) Feb 20 03:57:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35038 DF PROTO=TCP SPT=44828 DPT=9882 SEQ=3412511464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EB3BA0000000001030307) Feb 20 03:57:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35040 DF PROTO=TCP SPT=44828 DPT=9882 SEQ=3412511464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EBFA80000000001030307) Feb 20 03:57:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. 
Feb 20 03:57:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 03:57:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 03:57:51 localhost podman[109041]: 2026-02-20 08:57:51.152549408 +0000 UTC m=+0.080410897 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']},
managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 20 03:57:51 localhost podman[109041]: 2026-02-20 08:57:51.165075445 +0000 UTC m=+0.092936924 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, container_name=collectd, release=1766032510, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS',
'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack 
TripleO Team)
Feb 20 03:57:51 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 03:57:51 localhost systemd[1]: tmp-crun.ANVzJn.mount: Deactivated successfully.
Feb 20 03:57:51 localhost podman[109040]: 2026-02-20 08:57:51.207073113 +0000 UTC m=+0.139861695 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro',
'/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, container_name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 20 03:57:51 localhost podman[109039]: Error: container 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 is not running
Feb 20 03:57:51 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Main process exited, code=exited, status=125/n/a
Feb 20 03:57:51 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed with result 'exit-code'.
Feb 20 03:57:51 localhost podman[109040]: 2026-02-20 08:57:51.241463236 +0000 UTC m=+0.174251848 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 03:57:51 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 03:57:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 03:57:52 localhost podman[109086]: 2026-02-20 08:57:52.12976986 +0000 UTC m=+0.075288088 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image':
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git)
Feb 20 03:57:52 localhost podman[109086]: 2026-02-20 08:57:52.337828483 +0000 UTC m=+0.283346701 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0,
vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public)
Feb 20 03:57:52 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 03:57:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35041 DF PROTO=TCP SPT=44828 DPT=9882 SEQ=3412511464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597ECF680000000001030307)
Feb 20 03:57:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11800 DF PROTO=TCP SPT=48190 DPT=9105 SEQ=1323325226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EDA680000000001030307)
Feb 20 03:57:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 03:57:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 03:57:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 03:57:58 localhost systemd[1]: tmp-crun.v1oroh.mount: Deactivated successfully.
Feb 20 03:57:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 03:57:58 localhost podman[109116]: 2026-02-20 08:57:58.157240983 +0000 UTC m=+0.093787240 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 03:57:58 localhost podman[109121]: 2026-02-20 08:57:58.171011809 +0000 UTC m=+0.093772380 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller,
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4)
Feb 20 03:57:58 localhost podman[109117]: 2026-02-20 08:57:58.207208328 +0000 UTC m=+0.138731040 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH':
'24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64)
Feb 20 03:57:58 localhost podman[109116]: 2026-02-20 08:57:58.229993143 +0000 UTC m=+0.166539380 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public,
maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 03:57:58 localhost podman[109116]: unhealthy
Feb 20 03:57:58 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 03:57:58 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 03:57:58 localhost podman[109121]: 2026-02-20 08:57:58.264761707 +0000 UTC m=+0.187522268 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, architecture=x86_64,
tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public) Feb 20 03:57:58 localhost podman[109121]: unhealthy Feb 20 03:57:58 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:57:58 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:57:58 localhost podman[109159]: 2026-02-20 08:57:58.235710459 +0000 UTC m=+0.073000198 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, url=https://www.redhat.com, container_name=iscsid, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, 
vcs-type=git, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 20 03:57:58 localhost podman[109159]: 2026-02-20 08:57:58.317107936 +0000 UTC m=+0.154397665 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, 
tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 20 03:57:58 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully. Feb 20 03:57:58 localhost podman[109117]: 2026-02-20 08:57:58.33826544 +0000 UTC m=+0.269788132 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, managed_by=tripleo_ansible) Feb 20 03:57:58 localhost podman[109117]: unhealthy Feb 20 03:57:58 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:57:58 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. 
Feb 20 03:57:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11801 DF PROTO=TCP SPT=48190 DPT=9105 SEQ=1323325226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EE2690000000001030307) Feb 20 03:58:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35042 DF PROTO=TCP SPT=44828 DPT=9882 SEQ=3412511464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EEF690000000001030307) Feb 20 03:58:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17671 DF PROTO=TCP SPT=60500 DPT=9101 SEQ=3302469427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EFC650000000001030307) Feb 20 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:58:07 localhost systemd[1]: tmp-crun.c47BBy.mount: Deactivated successfully. 
Feb 20 03:58:07 localhost podman[109200]: 2026-02-20 08:58:07.902818844 +0000 UTC m=+0.093916836 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Feb 20 03:58:08 localhost podman[109200]: 2026-02-20 08:58:08.24329353 +0000 UTC m=+0.434391592 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com) Feb 20 03:58:08 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:58:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17673 DF PROTO=TCP SPT=60500 DPT=9101 SEQ=3302469427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F08680000000001030307) Feb 20 03:58:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11803 DF PROTO=TCP SPT=48190 DPT=9105 SEQ=1323325226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F13680000000001030307) Feb 20 03:58:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51596 DF PROTO=TCP SPT=58240 DPT=9102 SEQ=1064857520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F23680000000001030307) Feb 20 03:58:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56057 DF PROTO=TCP SPT=48320 DPT=9882 SEQ=4293003028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F28EA0000000001030307) Feb 20 03:58:18 localhost podman[108923]: time="2026-02-20T08:58:18Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Feb 20 03:58:18 localhost systemd[1]: libpod-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.scope: Deactivated successfully. Feb 20 03:58:18 localhost systemd[1]: libpod-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.scope: Consumed 6.508s CPU time. 
Feb 20 03:58:18 localhost podman[108923]: 2026-02-20 08:58:18.312714485 +0000 UTC m=+42.098986532 container died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, version=17.1.13, release=1766032510, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git) Feb 20 03:58:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: Deactivated successfully. Feb 20 03:58:18 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5. Feb 20 03:58:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: No such file or directory Feb 20 03:58:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5-userdata-shm.mount: Deactivated successfully. Feb 20 03:58:18 localhost systemd[1]: var-lib-containers-storage-overlay-271fbe47d50a90f03735a26a1ff5b20e2027c13cb6e9d5c8a6a9112793cd7c92-merged.mount: Deactivated successfully. 
Feb 20 03:58:18 localhost podman[108923]: 2026-02-20 08:58:18.394020829 +0000 UTC m=+42.180292826 container cleanup 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Feb 20 03:58:18 localhost podman[108923]: ceilometer_agent_ipmi Feb 20 03:58:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: No such file or directory Feb 20 03:58:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: No such file or directory Feb 20 03:58:18 localhost podman[109223]: 2026-02-20 08:58:18.439598758 +0000 UTC m=+0.120218318 container cleanup 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:58:18 localhost 
systemd[1]: libpod-conmon-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.scope: Deactivated successfully. Feb 20 03:58:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: No such file or directory Feb 20 03:58:18 localhost systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: No such file or directory Feb 20 03:58:18 localhost podman[109239]: 2026-02-20 08:58:18.527439035 +0000 UTC m=+0.060248425 container cleanup 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 20 03:58:18 localhost podman[109239]: ceilometer_agent_ipmi Feb 20 03:58:18 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully. Feb 20 03:58:18 localhost systemd[1]: Stopped ceilometer_agent_ipmi container. Feb 20 03:58:19 localhost python3.9[109342]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:58:19 localhost systemd[1]: Reloading. Feb 20 03:58:19 localhost systemd-sysv-generator[109370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:58:19 localhost systemd-rc-local-generator[109366]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:58:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:58:19 localhost systemd[1]: Stopping collectd container... Feb 20 03:58:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56059 DF PROTO=TCP SPT=48320 DPT=9882 SEQ=4293003028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F34E80000000001030307) Feb 20 03:58:21 localhost systemd[1]: libpod-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.scope: Deactivated successfully. Feb 20 03:58:21 localhost systemd[1]: libpod-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.scope: Consumed 2.254s CPU time. 
Feb 20 03:58:21 localhost podman[109382]: 2026-02-20 08:58:21.159713868 +0000 UTC m=+1.475591993 container stop 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64) Feb 20 03:58:21 localhost podman[109382]: 2026-02-20 08:58:21.194924817 +0000 UTC m=+1.510802992 container died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Feb 20 03:58:21 localhost systemd[1]: tmp-crun.3p0MX5.mount: Deactivated successfully. Feb 20 03:58:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. 
Feb 20 03:58:21 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: Deactivated successfully. Feb 20 03:58:21 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:58:21 localhost systemd[1]: Stopping /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e... Feb 20 03:58:21 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully. Feb 20 03:58:21 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e. Feb 20 03:58:21 localhost systemd[1]: tmp-crun.8xGWsa.mount: Deactivated successfully. Feb 20 03:58:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. Feb 20 03:58:21 localhost podman[109382]: 2026-02-20 08:58:21.281264917 +0000 UTC m=+1.597143042 container cleanup 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd) Feb 20 03:58:21 localhost podman[109382]: collectd Feb 20 03:58:21 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: Failed to open 
/run/systemd/transient/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: No such file or directory Feb 20 03:58:21 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Failed to open /run/systemd/transient/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: No such file or directory Feb 20 03:58:21 localhost podman[109394]: 2026-02-20 08:58:21.310418778 +0000 UTC m=+0.130406413 container cleanup 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, container_name=collectd, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:10:15Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 20 03:58:21 localhost systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:58:21 localhost systemd[1]: libpod-conmon-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.scope: Deactivated successfully. 
Feb 20 03:58:21 localhost podman[109410]: 2026-02-20 08:58:21.362796228 +0000 UTC m=+0.078706705 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red 
Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron) Feb 20 03:58:21 localhost podman[109446]: error opening file `/run/crun/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e/status`: No such file or directory Feb 20 03:58:21 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: Failed to open /run/systemd/transient/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: No such file or directory Feb 20 03:58:21 localhost systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Failed to open /run/systemd/transient/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: No such file or directory Feb 20 03:58:21 localhost podman[109429]: 2026-02-20 08:58:21.422234985 +0000 UTC m=+0.078516728 container cleanup 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 20 03:58:21 localhost podman[109429]: collectd Feb 20 03:58:21 localhost systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'. Feb 20 03:58:21 localhost systemd[1]: Stopped collectd container. Feb 20 03:58:21 localhost podman[109410]: 2026-02-20 08:58:21.452245093 +0000 UTC m=+0.168155600 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-cron-container, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 20 03:58:21 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully. Feb 20 03:58:22 localhost python3.9[109539]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:58:22 localhost systemd[1]: var-lib-containers-storage-overlay-27ac25f75ac951fbeef2be74c2898e3e141e5c323a5908632b2bdca4094605f7-merged.mount: Deactivated successfully. Feb 20 03:58:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e-userdata-shm.mount: Deactivated successfully. Feb 20 03:58:22 localhost systemd[1]: Reloading. Feb 20 03:58:22 localhost systemd-sysv-generator[109570]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:58:22 localhost systemd-rc-local-generator[109565]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:58:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:58:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:58:22 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 03:58:22 localhost systemd[1]: Stopping iscsid container... Feb 20 03:58:22 localhost recover_tripleo_nova_virtqemud[109586]: 63005 Feb 20 03:58:22 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 20 03:58:22 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 03:58:22 localhost systemd[1]: libpod-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.scope: Deactivated successfully. Feb 20 03:58:22 localhost systemd[1]: libpod-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.scope: Consumed 1.091s CPU time. 
Feb 20 03:58:22 localhost podman[109582]: 2026-02-20 08:58:22.689995861 +0000 UTC m=+0.084536185 container died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com) Feb 20 03:58:22 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: Deactivated successfully. Feb 20 03:58:22 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008. Feb 20 03:58:22 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: No such file or directory Feb 20 03:58:22 localhost podman[109582]: 2026-02-20 08:58:22.784252835 +0000 UTC m=+0.178793109 container cleanup 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, version=17.1.13, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 20 03:58:22 localhost podman[109582]: iscsid Feb 20 03:58:22 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: Failed to open 
/run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: No such file or directory Feb 20 03:58:22 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: No such file or directory Feb 20 03:58:22 localhost podman[109605]: 2026-02-20 08:58:22.796347199 +0000 UTC m=+0.094469731 container cleanup 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:58:22 localhost podman[109579]: 2026-02-20 08:58:22.757845409 +0000 UTC m=+0.156564271 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vcs-type=git, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 20 03:58:22 localhost systemd[1]: libpod-conmon-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.scope: Deactivated successfully. 
Feb 20 03:58:22 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: No such file or directory Feb 20 03:58:22 localhost systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: No such file or directory Feb 20 03:58:22 localhost podman[109641]: 2026-02-20 08:58:22.912055027 +0000 UTC m=+0.085634779 container cleanup 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_id=tripleo_step3, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=) Feb 20 03:58:22 localhost podman[109641]: iscsid Feb 20 03:58:22 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully. Feb 20 03:58:22 localhost systemd[1]: Stopped iscsid container. 
Feb 20 03:58:22 localhost podman[109579]: 2026-02-20 08:58:22.983010971 +0000 UTC m=+0.381729883 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, release=1766032510, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com) Feb 20 03:58:22 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully. Feb 20 03:58:23 localhost systemd[1]: tmp-crun.tPntMm.mount: Deactivated successfully. Feb 20 03:58:23 localhost systemd[1]: var-lib-containers-storage-overlay-6a9b5811d370cf611c5d7f7587dd7d8e1e05fe7557daab610e6d30271092c47d-merged.mount: Deactivated successfully. Feb 20 03:58:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008-userdata-shm.mount: Deactivated successfully. Feb 20 03:58:23 localhost python3.9[109745]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:58:23 localhost systemd[1]: Reloading. Feb 20 03:58:23 localhost systemd-rc-local-generator[109769]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:58:23 localhost systemd-sysv-generator[109775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 03:58:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:58:24 localhost sshd[109784]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:58:24 localhost systemd[1]: Stopping logrotate_crond container... Feb 20 03:58:24 localhost systemd[1]: libpod-1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.scope: Deactivated successfully. Feb 20 03:58:24 localhost podman[109788]: 2026-02-20 08:58:24.218154398 +0000 UTC m=+0.053595217 container died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:10:15Z, version=17.1.13, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5) Feb 20 03:58:24 localhost systemd[1]: tmp-crun.ETMQTz.mount: Deactivated successfully. Feb 20 03:58:24 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: Deactivated successfully. Feb 20 03:58:24 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e. 
Feb 20 03:58:24 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: No such file or directory Feb 20 03:58:24 localhost podman[109788]: 2026-02-20 08:58:24.273212651 +0000 UTC m=+0.108653500 container cleanup 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510) Feb 20 03:58:24 localhost podman[109788]: logrotate_crond Feb 20 03:58:24 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: No such file or directory Feb 20 03:58:24 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: No such file or directory Feb 20 03:58:24 localhost podman[109802]: 2026-02-20 08:58:24.310995559 +0000 UTC m=+0.083805072 container cleanup 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1766032510, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 20 03:58:24 localhost 
systemd[1]: libpod-conmon-1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.scope: Deactivated successfully. Feb 20 03:58:24 localhost podman[109831]: error opening file `/run/crun/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e/status`: No such file or directory Feb 20 03:58:24 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: No such file or directory Feb 20 03:58:24 localhost systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: No such file or directory Feb 20 03:58:24 localhost podman[109818]: 2026-02-20 08:58:24.403139128 +0000 UTC m=+0.068569742 container cleanup 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 03:58:24 localhost podman[109818]: logrotate_crond Feb 20 03:58:24 localhost systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully. Feb 20 03:58:24 localhost systemd[1]: Stopped logrotate_crond container. 
Feb 20 03:58:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56060 DF PROTO=TCP SPT=48320 DPT=9882 SEQ=4293003028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F44A80000000001030307) Feb 20 03:58:25 localhost python3.9[109924]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 03:58:25 localhost systemd[1]: var-lib-containers-storage-overlay-0b03ed83be81af8ca31d355d34bc84741adbeedeb0b33580fe27349115e799d7-merged.mount: Deactivated successfully. Feb 20 03:58:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e-userdata-shm.mount: Deactivated successfully. Feb 20 03:58:25 localhost systemd[1]: Reloading. Feb 20 03:58:25 localhost systemd-rc-local-generator[109947]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 03:58:25 localhost systemd-sysv-generator[109950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 03:58:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 03:58:25 localhost systemd[1]: Stopping metrics_qdr container... 
Feb 20 03:58:25 localhost kernel: qdrouterd[55300]: segfault at 0 ip 00007fdcedc9e7cb sp 00007fff412a6c00 error 4 in libc.so.6[7fdcedc3b000+175000] Feb 20 03:58:25 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9 Feb 20 03:58:25 localhost systemd[1]: Created slice Slice /system/systemd-coredump. Feb 20 03:58:25 localhost systemd[1]: Started Process Core Dump (PID 109977/UID 0). Feb 20 03:58:25 localhost systemd-coredump[109978]: Resource limits disable core dumping for process 55300 (qdrouterd). Feb 20 03:58:25 localhost systemd-coredump[109978]: Process 55300 (qdrouterd) of user 42465 dumped core. Feb 20 03:58:25 localhost systemd[1]: systemd-coredump@0-109977-0.service: Deactivated successfully. Feb 20 03:58:25 localhost podman[109965]: 2026-02-20 08:58:25.815581127 +0000 UTC m=+0.226337709 container died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, config_id=tripleo_step1, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=) Feb 20 03:58:25 localhost systemd[1]: libpod-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.scope: Deactivated successfully. Feb 20 03:58:25 localhost systemd[1]: libpod-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.scope: Consumed 28.009s CPU time. Feb 20 03:58:25 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: Deactivated successfully. 
Feb 20 03:58:25 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b. Feb 20 03:58:25 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: No such file or directory Feb 20 03:58:25 localhost systemd[1]: tmp-crun.TO2Bv5.mount: Deactivated successfully. Feb 20 03:58:25 localhost podman[109965]: 2026-02-20 08:58:25.861271589 +0000 UTC m=+0.272028121 container cleanup f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 20 03:58:25 localhost podman[109965]: metrics_qdr Feb 20 03:58:25 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: No such file or directory Feb 20 03:58:25 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: No such file or directory Feb 20 03:58:25 localhost podman[109982]: 2026-02-20 08:58:25.90267972 +0000 UTC m=+0.075307650 container cleanup f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, 
version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 20 03:58:25 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Feb 20 03:58:25 localhost systemd[1]: libpod-conmon-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.scope: Deactivated successfully.
Feb 20 03:58:25 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: No such file or directory
Feb 20 03:58:25 localhost systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: No such file or directory
Feb 20 03:58:26 localhost podman[109998]: 2026-02-20 08:58:26.000217685 +0000 UTC m=+0.068476057 container cleanup f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0,
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, vcs-type=git, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 03:58:26 localhost podman[109998]: metrics_qdr
Feb 20 03:58:26 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Feb 20 03:58:26 localhost systemd[1]: Stopped metrics_qdr container.
Feb 20 03:58:26 localhost systemd[1]: var-lib-containers-storage-overlay-748996d00ab757a5bda247e45e6a81f3904e24554510d07cc1e7533917ef279a-merged.mount: Deactivated successfully.
Feb 20 03:58:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b-userdata-shm.mount: Deactivated successfully.
Feb 20 03:58:26 localhost python3.9[110101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:58:27 localhost python3.9[110194]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:58:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2528 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=4075530326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F4FA80000000001030307)
Feb 20 03:58:28 localhost python3.9[110287]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:58:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 03:58:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 03:58:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 03:58:28 localhost podman[110382]: 2026-02-20 08:58:28.649015971 +0000 UTC m=+0.097735513 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., 
build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.13)
Feb 20 03:58:28 localhost podman[110382]: 2026-02-20 08:58:28.663159498 +0000 UTC m=+0.111879010 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always',
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git)
Feb 20 03:58:28 localhost podman[110383]: 2026-02-20 08:58:28.690833263 +0000 UTC m=+0.131810637 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, distribution-scope=public, container_name=nova_compute, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test':
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 20 03:58:28 localhost podman[110381]: 2026-02-20 08:58:28.735616617 +0000 UTC m=+0.182697879 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1,
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 20 03:58:28 localhost podman[110383]: 2026-02-20 08:58:28.739147407 +0000 UTC m=+0.180124791 container exec_died
a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5)
Feb 20 03:58:28 localhost podman[110383]: unhealthy
Feb 20 03:58:28 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 03:58:28 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 03:58:28 localhost podman[110382]: unhealthy
Feb 20 03:58:28 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 03:58:28 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 03:58:28 localhost podman[110381]: 2026-02-20 08:58:28.778466803 +0000 UTC m=+0.225548115 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 03:58:28 localhost podman[110381]: unhealthy
Feb 20 03:58:28 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 03:58:28 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 03:58:28 localhost python3.9[110380]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:58:28 localhost systemd[1]: Reloading.
Feb 20 03:58:29 localhost systemd-rc-local-generator[110469]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:58:29 localhost systemd-sysv-generator[110475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:58:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:58:29 localhost systemd[1]: Stopping nova_compute container...
Feb 20 03:58:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2529 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=4075530326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F57A80000000001030307)
Feb 20 03:58:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51597 DF PROTO=TCP SPT=58240 DPT=9102 SEQ=1064857520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F63680000000001030307)
Feb 20 03:58:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22181 DF PROTO=TCP SPT=42204 DPT=9101 SEQ=237021640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F71950000000001030307)
Feb 20 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 03:58:38 localhost podman[110495]: 2026-02-20 08:58:38.649000417 +0000 UTC m=+0.084408121 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, 
description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 03:58:39 localhost podman[110495]: 2026-02-20 08:58:39.02138018 +0000 UTC m=+0.456787884 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro',
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, release=1766032510, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 20 03:58:39 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:58:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22183 DF PROTO=TCP SPT=42204 DPT=9101 SEQ=237021640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F7DA80000000001030307) Feb 20 03:58:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2531 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=4075530326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F87680000000001030307) Feb 20 03:58:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:d7:b4:4a MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=45046 SEQ=955136020 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 20 03:58:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10319 DF PROTO=TCP SPT=59640 DPT=9882 SEQ=2421885162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F9E1A0000000001030307) Feb 20 03:58:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10321 DF PROTO=TCP SPT=59640 DPT=9882 SEQ=2421885162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FAA280000000001030307) Feb 20 03:58:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10322 DF PROTO=TCP SPT=59640 DPT=9882 SEQ=2421885162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FB9E90000000001030307) Feb 20 03:58:57 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51047 DF PROTO=TCP SPT=51900 DPT=9105 SEQ=2352136963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FC4E80000000001030307) Feb 20 03:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:58:58 localhost podman[110593]: 2026-02-20 08:58:58.899780323 +0000 UTC m=+0.086375732 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=) Feb 20 03:58:58 localhost podman[110594]: Error: container a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 is not running Feb 20 
03:58:58 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=125/n/a Feb 20 03:58:58 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'. Feb 20 03:58:58 localhost podman[110593]: 2026-02-20 08:58:58.935684803 +0000 UTC m=+0.122280192 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 20 03:58:58 localhost podman[110593]: unhealthy Feb 20 03:58:58 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:58:58 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:58:59 localhost podman[110595]: 2026-02-20 08:58:59.0028472 +0000 UTC m=+0.186642242 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 03:58:59 localhost podman[110595]: 2026-02-20 08:58:59.041229217 +0000 UTC m=+0.225024279 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, architecture=x86_64, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 20 03:58:59 localhost podman[110595]: unhealthy Feb 20 03:58:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:58:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:58:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51048 DF PROTO=TCP SPT=51900 DPT=9105 SEQ=2352136963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FCCE80000000001030307) Feb 20 03:59:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10323 DF PROTO=TCP SPT=59640 DPT=9882 SEQ=2421885162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FD9680000000001030307) Feb 20 03:59:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9182 DF PROTO=TCP SPT=40754 DPT=9101 SEQ=255780391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FE6C50000000001030307) Feb 20 03:59:08 localhost sshd[110645]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:59:09 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:59:09 localhost podman[110647]: 2026-02-20 08:59:09.139313486 +0000 UTC m=+0.079258071 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 20 03:59:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9184 DF PROTO=TCP SPT=40754 DPT=9101 SEQ=255780391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FF2E80000000001030307) Feb 20 03:59:09 localhost podman[110647]: 2026-02-20 08:59:09.496059016 +0000 UTC m=+0.436003591 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:59:09 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully. 
Feb 20 03:59:11 localhost podman[110483]: time="2026-02-20T08:59:11Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL" Feb 20 03:59:11 localhost systemd[1]: session-c11.scope: Deactivated successfully. Feb 20 03:59:11 localhost systemd[1]: libpod-a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.scope: Deactivated successfully. Feb 20 03:59:11 localhost systemd[1]: libpod-a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.scope: Consumed 36.163s CPU time. Feb 20 03:59:11 localhost podman[110483]: 2026-02-20 08:59:11.34238018 +0000 UTC m=+42.100069068 container died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 20 03:59:11 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: 
Deactivated successfully. Feb 20 03:59:11 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380. Feb 20 03:59:11 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: No such file or directory Feb 20 03:59:11 localhost systemd[1]: tmp-crun.qPO3qi.mount: Deactivated successfully. Feb 20 03:59:11 localhost systemd[1]: var-lib-containers-storage-overlay-f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2-merged.mount: Deactivated successfully. Feb 20 03:59:11 localhost podman[110483]: 2026-02-20 08:59:11.400577878 +0000 UTC m=+42.158266736 container cleanup a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_id=tripleo_step5, io.buildah.version=1.41.5) Feb 20 03:59:11 
localhost podman[110483]: nova_compute Feb 20 03:59:11 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: No such file or directory Feb 20 03:59:11 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: No such file or directory Feb 20 03:59:11 localhost podman[110670]: 2026-02-20 08:59:11.429215355 +0000 UTC m=+0.075825226 container cleanup a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 03:59:11 localhost systemd[1]: libpod-conmon-a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.scope: Deactivated 
successfully. Feb 20 03:59:11 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: No such file or directory Feb 20 03:59:11 localhost systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: No such file or directory Feb 20 03:59:11 localhost podman[110686]: 2026-02-20 08:59:11.544992124 +0000 UTC m=+0.078239350 container cleanup a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1766032510, vcs-type=git) Feb 20 03:59:11 localhost podman[110686]: nova_compute Feb 20 03:59:11 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. 
Feb 20 03:59:11 localhost systemd[1]: Stopped nova_compute container.
Feb 20 03:59:11 localhost systemd[1]: tripleo_nova_compute.service: Consumed 1.074s CPU time, no IO.
Feb 20 03:59:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51050 DF PROTO=TCP SPT=51900 DPT=9105 SEQ=2352136963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FFD680000000001030307)
Feb 20 03:59:12 localhost python3.9[110789]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:59:12 localhost systemd[1]: Reloading.
Feb 20 03:59:12 localhost systemd-sysv-generator[110817]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:59:12 localhost systemd-rc-local-generator[110813]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:59:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:59:12 localhost systemd[1]: Stopping nova_migration_target container...
Feb 20 03:59:12 localhost systemd[1]: libpod-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.scope: Deactivated successfully.
Feb 20 03:59:12 localhost systemd[1]: libpod-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.scope: Consumed 34.496s CPU time.
Feb 20 03:59:12 localhost podman[110830]: 2026-02-20 08:59:12.82246478 +0000 UTC m=+0.094400200 container died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute) Feb 20 03:59:12 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: Deactivated successfully. Feb 20 03:59:12 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601. Feb 20 03:59:12 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: No such file or directory Feb 20 03:59:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601-userdata-shm.mount: Deactivated successfully. 
Feb 20 03:59:12 localhost podman[110830]: 2026-02-20 08:59:12.867503213 +0000 UTC m=+0.139438603 container cleanup b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, release=1766032510, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc.) Feb 20 03:59:12 localhost podman[110830]: nova_migration_target Feb 20 03:59:12 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: No such file or directory Feb 20 03:59:12 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: No such file or directory Feb 20 03:59:12 localhost podman[110842]: 2026-02-20 08:59:12.902292588 +0000 UTC m=+0.065221607 container cleanup b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vcs-type=git, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 20 03:59:12 localhost systemd[1]: 
libpod-conmon-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.scope: Deactivated successfully. Feb 20 03:59:13 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: No such file or directory Feb 20 03:59:13 localhost systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: No such file or directory Feb 20 03:59:13 localhost podman[110857]: 2026-02-20 08:59:13.015703085 +0000 UTC m=+0.070254333 container cleanup b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:59:13 localhost podman[110857]: nova_migration_target Feb 20 03:59:13 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Feb 20 03:59:13 localhost systemd[1]: Stopped nova_migration_target container. 
Feb 20 03:59:13 localhost python3.9[110959]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 03:59:13 localhost systemd[1]: var-lib-containers-storage-overlay-5a6d255614f6fb8bbe458bab22374857122c06c78d4c0aacb8f6490a72d4cd61-merged.mount: Deactivated successfully.
Feb 20 03:59:13 localhost systemd[1]: Reloading.
Feb 20 03:59:13 localhost systemd-rc-local-generator[110989]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 03:59:13 localhost systemd-sysv-generator[110993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 03:59:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 03:59:14 localhost systemd[1]: Stopping nova_virtlogd_wrapper container...
Feb 20 03:59:14 localhost systemd[1]: libpod-e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19.scope: Deactivated successfully.
Feb 20 03:59:14 localhost podman[111000]: 2026-02-20 08:59:14.233739464 +0000 UTC m=+0.063843325 container died e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:31:49Z) Feb 20 03:59:14 localhost podman[111000]: 2026-02-20 08:59:14.277715853 +0000 UTC m=+0.107819604 container cleanup e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 
nova-libvirt, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_virtlogd_wrapper, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 03:59:14 localhost podman[111000]: nova_virtlogd_wrapper Feb 20 03:59:14 localhost podman[111013]: 2026-02-20 08:59:14.299574989 +0000 UTC m=+0.063078951 container cleanup e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-libvirt, io.openshift.expose-services=, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., 
com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible)
Feb 20 03:59:14 localhost systemd[1]: var-lib-containers-storage-overlay-81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24-merged.mount: Deactivated successfully.
Feb 20 03:59:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19-userdata-shm.mount: Deactivated successfully.
Feb 20 03:59:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52752 DF PROTO=TCP SPT=55974 DPT=9100 SEQ=3057389671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59800D680000000001030307)
Feb 20 03:59:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30174 DF PROTO=TCP SPT=57916 DPT=9882 SEQ=1359209721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980134A0000000001030307)
Feb 20 03:59:18 localhost sshd[111030]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 03:59:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30176 DF PROTO=TCP SPT=57916 DPT=9882 SEQ=1359209721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59801F680000000001030307)
Feb 20 03:59:21 localhost systemd[1]: Stopping User Manager for UID 0...
Feb 20 03:59:21 localhost systemd[85653]: Activating special unit Exit the Session...
Feb 20 03:59:21 localhost systemd[85653]: Removed slice User Background Tasks Slice.
Feb 20 03:59:21 localhost systemd[85653]: Stopped target Main User Target. Feb 20 03:59:21 localhost systemd[85653]: Stopped target Basic System. Feb 20 03:59:21 localhost systemd[85653]: Stopped target Paths. Feb 20 03:59:21 localhost systemd[85653]: Stopped target Sockets. Feb 20 03:59:21 localhost systemd[85653]: Stopped target Timers. Feb 20 03:59:21 localhost systemd[85653]: Stopped Daily Cleanup of User's Temporary Directories. Feb 20 03:59:21 localhost systemd[85653]: Closed D-Bus User Message Bus Socket. Feb 20 03:59:21 localhost systemd[85653]: Stopped Create User's Volatile Files and Directories. Feb 20 03:59:21 localhost systemd[85653]: Removed slice User Application Slice. Feb 20 03:59:21 localhost systemd[85653]: Reached target Shutdown. Feb 20 03:59:21 localhost systemd[85653]: Finished Exit the Session. Feb 20 03:59:21 localhost systemd[85653]: Reached target Exit the Session. Feb 20 03:59:21 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 20 03:59:21 localhost systemd[1]: Stopped User Manager for UID 0. Feb 20 03:59:21 localhost systemd[1]: user@0.service: Consumed 3.926s CPU time, no IO. Feb 20 03:59:21 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 20 03:59:21 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 20 03:59:21 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 20 03:59:21 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 20 03:59:21 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 20 03:59:21 localhost systemd[1]: user-0.slice: Consumed 4.864s CPU time. 
Feb 20 03:59:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30177 DF PROTO=TCP SPT=57916 DPT=9882 SEQ=1359209721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59802F280000000001030307) Feb 20 03:59:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31478 DF PROTO=TCP SPT=55518 DPT=9105 SEQ=818947730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598039E80000000001030307) Feb 20 03:59:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 03:59:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:59:29 localhost podman[111034]: 2026-02-20 08:59:29.160873093 +0000 UTC m=+0.088646541 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 20 03:59:29 localhost podman[111033]: 2026-02-20 08:59:29.201977524 +0000 UTC m=+0.132673414 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, 
build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.openshift.expose-services=) Feb 20 03:59:29 localhost podman[111034]: 2026-02-20 08:59:29.209018571 +0000 UTC m=+0.136791989 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 20 03:59:29 localhost podman[111034]: unhealthy Feb 20 03:59:29 localhost podman[111033]: 2026-02-20 08:59:29.222041614 +0000 UTC m=+0.152737554 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO 
Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 20 03:59:29 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:59:29 localhost systemd[1]: 
0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:59:29 localhost podman[111033]: unhealthy Feb 20 03:59:29 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:59:29 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 03:59:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31479 DF PROTO=TCP SPT=55518 DPT=9105 SEQ=818947730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598041E90000000001030307) Feb 20 03:59:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9196 DF PROTO=TCP SPT=56218 DPT=9102 SEQ=4037064552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59804D690000000001030307) Feb 20 03:59:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39615 DF PROTO=TCP SPT=44112 DPT=9101 SEQ=893233341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59805BF60000000001030307) Feb 20 03:59:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39617 DF PROTO=TCP SPT=44112 DPT=9101 SEQ=893233341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598067E80000000001030307) Feb 20 03:59:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=31481 DF PROTO=TCP SPT=55518 DPT=9105 SEQ=818947730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598071680000000001030307) Feb 20 03:59:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52076 DF PROTO=TCP SPT=56640 DPT=9102 SEQ=380876730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598081690000000001030307) Feb 20 03:59:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 03:59:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:59:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10529 DF PROTO=TCP SPT=43552 DPT=9882 SEQ=227066227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980887A0000000001030307) Feb 20 03:59:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10531 DF PROTO=TCP SPT=43552 DPT=9882 SEQ=227066227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598094680000000001030307) Feb 20 03:59:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] 
------- DUMPING STATS ------- Feb 20 03:59:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 03:59:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10532 DF PROTO=TCP SPT=43552 DPT=9882 SEQ=227066227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980A4280000000001030307) Feb 20 03:59:55 localhost sshd[111152]: main: sshd: ssh-rsa algorithm is disabled Feb 20 03:59:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5076 DF PROTO=TCP SPT=47872 DPT=9105 SEQ=1572668428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980AF280000000001030307) Feb 20 03:59:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 03:59:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. 
Feb 20 03:59:59 localhost podman[111154]: 2026-02-20 08:59:59.400326134 +0000 UTC m=+0.088321812 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 03:59:59 localhost podman[111154]: 2026-02-20 08:59:59.418776314 +0000 UTC m=+0.106772022 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) Feb 20 03:59:59 localhost podman[111154]: unhealthy Feb 20 03:59:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:59:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 03:59:59 localhost podman[111155]: 2026-02-20 08:59:59.511040896 +0000 UTC m=+0.195942239 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent) Feb 20 03:59:59 localhost podman[111155]: 
2026-02-20 08:59:59.524017128 +0000 UTC m=+0.208918521 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 03:59:59 localhost podman[111155]: unhealthy Feb 20 03:59:59 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 03:59:59 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 03:59:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5077 DF PROTO=TCP SPT=47872 DPT=9105 SEQ=1572668428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980B7280000000001030307) Feb 20 04:00:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10533 DF PROTO=TCP SPT=43552 DPT=9882 SEQ=227066227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980C5690000000001030307) Feb 20 04:00:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36737 DF PROTO=TCP SPT=57714 DPT=9101 SEQ=3142129976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980D1240000000001030307) Feb 20 04:00:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36739 DF PROTO=TCP SPT=57714 DPT=9101 SEQ=3142129976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980DD280000000001030307) Feb 20 04:00:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5079 DF PROTO=TCP SPT=47872 DPT=9105 SEQ=1572668428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980E7690000000001030307) Feb 20 04:00:15 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 20 04:00:15 localhost recover_tripleo_nova_virtqemud[111196]: 63005 Feb 20 04:00:15 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 20 04:00:15 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 20 04:00:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=278 DF PROTO=TCP SPT=33934 DPT=9100 SEQ=2071844979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980F7690000000001030307) Feb 20 04:00:16 localhost sshd[111197]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:00:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42199 DF PROTO=TCP SPT=44354 DPT=9882 SEQ=3409662979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980FDAA0000000001030307) Feb 20 04:00:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42201 DF PROTO=TCP SPT=44354 DPT=9882 SEQ=3409662979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598109A80000000001030307) Feb 20 04:00:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42202 DF PROTO=TCP SPT=44354 DPT=9882 SEQ=3409662979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598119680000000001030307) Feb 20 04:00:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19165 DF PROTO=TCP SPT=42918 DPT=9105 SEQ=2861299012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598124680000000001030307) Feb 20 04:00:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. 
Feb 20 04:00:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 04:00:29 localhost podman[111200]: 2026-02-20 09:00:29.640821957 +0000 UTC m=+0.073204824 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent) Feb 20 04:00:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19166 DF PROTO=TCP SPT=42918 DPT=9105 SEQ=2861299012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59812C680000000001030307) Feb 20 04:00:29 localhost podman[111199]: 2026-02-20 09:00:29.704824046 +0000 UTC m=+0.137757830 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, release=1766032510, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 04:00:29 localhost podman[111199]: 2026-02-20 09:00:29.717915641 +0000 UTC m=+0.150849435 container exec_died 
0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible) Feb 20 04:00:29 localhost podman[111199]: unhealthy Feb 20 04:00:29 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:00:29 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 04:00:29 localhost podman[111200]: 2026-02-20 09:00:29.730972085 +0000 UTC m=+0.163354912 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public) Feb 20 04:00:29 localhost podman[111200]: unhealthy Feb 20 04:00:29 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:00:29 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 04:00:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42203 DF PROTO=TCP SPT=44354 DPT=9882 SEQ=3409662979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598139680000000001030307) Feb 20 04:00:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54777 DF PROTO=TCP SPT=49322 DPT=9101 SEQ=2158845858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598146540000000001030307) Feb 20 04:00:38 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing. Feb 20 04:00:38 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 62225 (conmon) with signal SIGKILL. Feb 20 04:00:38 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL Feb 20 04:00:38 localhost systemd[1]: libpod-conmon-e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19.scope: Deactivated successfully. Feb 20 04:00:38 localhost podman[111251]: error opening file `/run/crun/e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19/status`: No such file or directory Feb 20 04:00:38 localhost systemd[1]: tmp-crun.xCxWUS.mount: Deactivated successfully. 
Feb 20 04:00:38 localhost podman[111239]: 2026-02-20 09:00:38.398126425 +0000 UTC m=+0.082351676 container cleanup e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, managed_by=tripleo_ansible, container_name=nova_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:00:38 localhost podman[111239]: nova_virtlogd_wrapper Feb 20 04:00:38 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'. Feb 20 04:00:38 localhost systemd[1]: Stopped nova_virtlogd_wrapper container. Feb 20 04:00:39 localhost python3.9[111345]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:00:39 localhost systemd[1]: Reloading. 
Feb 20 04:00:39 localhost systemd-sysv-generator[111378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:00:39 localhost systemd-rc-local-generator[111372]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:00:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:00:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54779 DF PROTO=TCP SPT=49322 DPT=9101 SEQ=2158845858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598152680000000001030307) Feb 20 04:00:39 localhost systemd[1]: Stopping nova_virtnodedevd container... Feb 20 04:00:39 localhost systemd[1]: libpod-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1.scope: Deactivated successfully. Feb 20 04:00:39 localhost systemd[1]: libpod-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1.scope: Consumed 1.496s CPU time. 
Feb 20 04:00:39 localhost podman[111386]: 2026-02-20 09:00:39.620714085 +0000 UTC m=+0.056520868 container died b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=) Feb 20 04:00:39 localhost podman[111386]: 2026-02-20 09:00:39.659817254 +0000 UTC m=+0.095624057 container cleanup b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, container_name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Feb 20 04:00:39 localhost podman[111386]: nova_virtnodedevd Feb 20 04:00:39 localhost podman[111399]: 2026-02-20 09:00:39.712424011 +0000 UTC m=+0.076056593 container cleanup b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, container_name=nova_virtnodedevd, org.opencontainers.image.created=2026-01-12T23:31:49Z, build-date=2026-01-12T23:31:49Z, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 04:00:39 localhost systemd[1]: libpod-conmon-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1.scope: Deactivated successfully. Feb 20 04:00:39 localhost podman[111428]: error opening file `/run/crun/b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1/status`: No such file or directory Feb 20 04:00:39 localhost podman[111416]: 2026-02-20 09:00:39.80198342 +0000 UTC m=+0.064814696 container cleanup b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, batch=17.1_20260112.1) Feb 20 04:00:39 localhost podman[111416]: nova_virtnodedevd Feb 20 04:00:39 localhost systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully. Feb 20 04:00:39 localhost systemd[1]: Stopped nova_virtnodedevd container. Feb 20 04:00:40 localhost python3.9[111521]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:00:40 localhost systemd[1]: Reloading. Feb 20 04:00:40 localhost systemd-sysv-generator[111555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:00:40 localhost systemd-rc-local-generator[111551]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:00:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:00:40 localhost systemd[1]: var-lib-containers-storage-overlay-c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d-merged.mount: Deactivated successfully. Feb 20 04:00:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1-userdata-shm.mount: Deactivated successfully. Feb 20 04:00:40 localhost systemd[1]: Stopping nova_virtproxyd container... Feb 20 04:00:41 localhost systemd[1]: libpod-e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34.scope: Deactivated successfully. 
Feb 20 04:00:41 localhost podman[111562]: 2026-02-20 09:00:41.036926731 +0000 UTC m=+0.083284695 container died e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.expose-services=, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, container_name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc.) 
Feb 20 04:00:41 localhost podman[111562]: 2026-02-20 09:00:41.081444317 +0000 UTC m=+0.127802281 container cleanup e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, container_name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 04:00:41 localhost podman[111562]: nova_virtproxyd Feb 20 04:00:41 localhost podman[111577]: 2026-02-20 09:00:41.11385802 +0000 UTC m=+0.068715776 container cleanup e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, distribution-scope=public, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtproxyd, managed_by=tripleo_ansible) Feb 20 04:00:41 localhost systemd[1]: libpod-conmon-e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34.scope: Deactivated successfully. 
Feb 20 04:00:41 localhost podman[111606]: error opening file `/run/crun/e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34/status`: No such file or directory Feb 20 04:00:41 localhost podman[111595]: 2026-02-20 09:00:41.208698492 +0000 UTC m=+0.057736256 container cleanup e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtproxyd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step3, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com) Feb 20 04:00:41 localhost podman[111595]: nova_virtproxyd Feb 20 04:00:41 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Feb 20 04:00:41 localhost systemd[1]: Stopped nova_virtproxyd container. Feb 20 04:00:41 localhost sshd[111608]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:00:41 localhost systemd[1]: tmp-crun.mmFv9A.mount: Deactivated successfully. 
Feb 20 04:00:41 localhost systemd[1]: var-lib-containers-storage-overlay-fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe-merged.mount: Deactivated successfully. Feb 20 04:00:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34-userdata-shm.mount: Deactivated successfully. Feb 20 04:00:41 localhost python3.9[111701]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:00:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19168 DF PROTO=TCP SPT=42918 DPT=9105 SEQ=2861299012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59815D680000000001030307) Feb 20 04:00:43 localhost systemd[1]: Reloading. Feb 20 04:00:43 localhost systemd-rc-local-generator[111729]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:00:43 localhost systemd-sysv-generator[111733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:00:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:00:43 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. Feb 20 04:00:43 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. Feb 20 04:00:43 localhost systemd[1]: Stopping nova_virtqemud container... 
Feb 20 04:00:43 localhost systemd[1]: libpod-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69.scope: Deactivated successfully. Feb 20 04:00:43 localhost systemd[1]: libpod-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69.scope: Consumed 2.746s CPU time. Feb 20 04:00:43 localhost podman[111742]: 2026-02-20 09:00:43.437958195 +0000 UTC m=+0.074970469 container died 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, container_name=nova_virtqemud, managed_by=tripleo_ansible, tcib_managed=true) Feb 20 04:00:43 localhost podman[111742]: 2026-02-20 09:00:43.462853514 +0000 UTC m=+0.099865728 container cleanup 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=nova_virtqemud, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 20 04:00:43 localhost podman[111742]: nova_virtqemud Feb 20 04:00:43 localhost podman[111755]: 2026-02-20 09:00:43.51575483 +0000 UTC m=+0.064700611 container cleanup 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, 
distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, container_name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt) Feb 20 04:00:44 localhost systemd[1]: var-lib-containers-storage-overlay-fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260-merged.mount: Deactivated successfully. Feb 20 04:00:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:00:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59488 DF PROTO=TCP SPT=39078 DPT=9102 SEQ=3151336319 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59816B680000000001030307) Feb 20 04:00:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37037 DF PROTO=TCP SPT=38760 DPT=9882 SEQ=89716857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598172DA0000000001030307) Feb 20 04:00:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37039 DF PROTO=TCP SPT=38760 DPT=9882 SEQ=89716857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59817EE80000000001030307) Feb 20 04:00:53 localhost sshd[111831]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:00:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37040 DF PROTO=TCP SPT=38760 DPT=9882 SEQ=89716857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59818EA80000000001030307) Feb 20 04:00:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52129 DF PROTO=TCP SPT=45840 DPT=9105 SEQ=470536505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598199680000000001030307) Feb 20 04:00:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52130 DF PROTO=TCP SPT=45840 DPT=9105 
SEQ=470536505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981A1690000000001030307) Feb 20 04:00:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 04:00:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 04:00:59 localhost systemd[1]: tmp-crun.OKadpE.mount: Deactivated successfully. Feb 20 04:00:59 localhost podman[111849]: 2026-02-20 09:00:59.913113079 +0000 UTC m=+0.097346801 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 20 04:00:59 localhost podman[111849]: 2026-02-20 09:00:59.934071737 +0000 UTC m=+0.118305529 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO 
Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 04:00:59 localhost podman[111849]: unhealthy Feb 20 04:00:59 localhost systemd[1]: tmp-crun.FsmNeX.mount: Deactivated successfully. Feb 20 04:00:59 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:00:59 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 04:00:59 localhost podman[111848]: 2026-02-20 09:00:59.955285603 +0000 UTC m=+0.139870175 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step4) Feb 20 04:00:59 localhost podman[111848]: 2026-02-20 09:00:59.969860204 +0000 UTC m=+0.154444776 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 04:00:59 localhost podman[111848]: unhealthy Feb 20 04:00:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:00:59 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 04:01:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19335 DF PROTO=TCP SPT=53684 DPT=9100 SEQ=2478429227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981AD680000000001030307) Feb 20 04:01:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5605 DF PROTO=TCP SPT=42846 DPT=9101 SEQ=3351586781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981BB840000000001030307) Feb 20 04:01:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5607 DF PROTO=TCP SPT=42846 DPT=9101 SEQ=3351586781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981C7A80000000001030307) Feb 20 04:01:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac 
MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52132 DF PROTO=TCP SPT=45840 DPT=9105 SEQ=470536505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981D1680000000001030307) Feb 20 04:01:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=319 DF PROTO=TCP SPT=34466 DPT=9100 SEQ=2865431081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981E1680000000001030307) Feb 20 04:01:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2780 DF PROTO=TCP SPT=55480 DPT=9882 SEQ=2817686025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981E80A0000000001030307) Feb 20 04:01:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2782 DF PROTO=TCP SPT=55480 DPT=9882 SEQ=2817686025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981F4280000000001030307) Feb 20 04:01:24 localhost sshd[111900]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:01:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2783 DF PROTO=TCP SPT=55480 DPT=9882 SEQ=2817686025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598203E80000000001030307) Feb 20 04:01:25 localhost sshd[111902]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:01:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18927 DF PROTO=TCP SPT=49522 DPT=9105 
SEQ=155240626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59820EA80000000001030307) Feb 20 04:01:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18928 DF PROTO=TCP SPT=49522 DPT=9105 SEQ=155240626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598216A80000000001030307) Feb 20 04:01:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 04:01:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 04:01:30 localhost podman[111904]: 2026-02-20 09:01:30.158005061 +0000 UTC m=+0.090653273 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1766032510, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 04:01:30 localhost podman[111904]: 2026-02-20 09:01:30.172466358 +0000 UTC m=+0.105114570 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 20 04:01:30 localhost podman[111904]: unhealthy Feb 20 04:01:30 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:01:30 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. 
Feb 20 04:01:30 localhost podman[111905]: 2026-02-20 09:01:30.260587063 +0000 UTC m=+0.189024755 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 20 04:01:30 localhost podman[111905]: 2026-02-20 09:01:30.275065861 +0000 UTC m=+0.203503533 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, 
version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4) Feb 20 04:01:30 localhost podman[111905]: unhealthy Feb 20 04:01:30 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:01:30 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. Feb 20 04:01:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2784 DF PROTO=TCP SPT=55480 DPT=9882 SEQ=2817686025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598223680000000001030307) Feb 20 04:01:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62345 DF PROTO=TCP SPT=48546 DPT=9101 SEQ=1091139209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598230B40000000001030307) Feb 20 04:01:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62347 DF PROTO=TCP SPT=48546 DPT=9101 SEQ=1091139209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59823CA80000000001030307) Feb 20 04:01:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18930 DF PROTO=TCP SPT=49522 DPT=9105 SEQ=155240626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598247680000000001030307) 
Feb 20 04:01:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61083 DF PROTO=TCP SPT=44978 DPT=9102 SEQ=1875276320 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598257680000000001030307) Feb 20 04:01:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5633 DF PROTO=TCP SPT=44844 DPT=9882 SEQ=2640199640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59825D3A0000000001030307) Feb 20 04:01:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5635 DF PROTO=TCP SPT=44844 DPT=9882 SEQ=2640199640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598269280000000001030307) Feb 20 04:01:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5636 DF PROTO=TCP SPT=44844 DPT=9882 SEQ=2640199640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598278E80000000001030307) Feb 20 04:01:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51558 DF PROTO=TCP SPT=39554 DPT=9105 SEQ=2002411649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598283E80000000001030307) Feb 20 04:01:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51559 DF PROTO=TCP SPT=39554 DPT=9105 SEQ=2002411649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A59828BE80000000001030307) Feb 20 04:02:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 04:02:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 04:02:00 localhost systemd[1]: tmp-crun.mQoQL5.mount: Deactivated successfully. Feb 20 04:02:00 localhost podman[112072]: 2026-02-20 09:02:00.653618986 +0000 UTC m=+0.086107294 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Feb 20 04:02:00 localhost podman[112072]: 2026-02-20 09:02:00.700138143 +0000 UTC m=+0.132626531 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, 
config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 04:02:00 localhost podman[112072]: unhealthy Feb 20 04:02:00 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:02:00 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'. 
Feb 20 04:02:00 localhost podman[112071]: 2026-02-20 09:02:00.743522175 +0000 UTC m=+0.178113338 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, release=1766032510, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller) Feb 20 04:02:00 localhost podman[112071]: 2026-02-20 09:02:00.78798197 +0000 UTC m=+0.222573143 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4) Feb 20 04:02:00 localhost podman[112071]: unhealthy Feb 20 04:02:00 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:02:00 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'. Feb 20 04:02:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63526 DF PROTO=TCP SPT=58314 DPT=9100 SEQ=3974477460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598297680000000001030307) Feb 20 04:02:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14847 DF PROTO=TCP SPT=39812 DPT=9101 SEQ=709151425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982A5E70000000001030307) Feb 20 04:02:07 localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing. Feb 20 04:02:07 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 63001 (conmon) with signal SIGKILL. 
Feb 20 04:02:07 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL Feb 20 04:02:07 localhost systemd[1]: libpod-conmon-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69.scope: Deactivated successfully. Feb 20 04:02:07 localhost podman[112124]: error opening file `/run/crun/0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69/status`: No such file or directory Feb 20 04:02:07 localhost podman[112111]: 2026-02-20 09:02:07.609687539 +0000 UTC m=+0.049602635 container cleanup 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtqemud, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 20 04:02:07 localhost podman[112111]: nova_virtqemud Feb 20 04:02:07 localhost systemd[1]: 
tripleo_nova_virtqemud.service: Failed with result 'timeout'. Feb 20 04:02:07 localhost systemd[1]: Stopped nova_virtqemud container. Feb 20 04:02:08 localhost python3.9[112217]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:02:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14849 DF PROTO=TCP SPT=39812 DPT=9101 SEQ=709151425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982B1E80000000001030307) Feb 20 04:02:09 localhost systemd[1]: Reloading. Feb 20 04:02:09 localhost systemd-rc-local-generator[112244]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:02:09 localhost systemd-sysv-generator[112249]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:02:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:02:09 localhost sshd[112255]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:02:10 localhost python3.9[112348]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:02:10 localhost systemd[1]: Reloading. Feb 20 04:02:10 localhost systemd-rc-local-generator[112372]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:02:10 localhost systemd-sysv-generator[112376]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:02:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:02:10 localhost systemd[1]: Stopping nova_virtsecretd container... Feb 20 04:02:10 localhost systemd[1]: tmp-crun.hwfvmZ.mount: Deactivated successfully. Feb 20 04:02:10 localhost systemd[1]: libpod-c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2.scope: Deactivated successfully. Feb 20 04:02:10 localhost podman[112389]: 2026-02-20 09:02:10.934434943 +0000 UTC m=+0.079496129 container died c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_virtsecretd) Feb 20 04:02:10 localhost podman[112389]: 2026-02-20 09:02:10.975351648 +0000 UTC m=+0.120412814 container cleanup c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtsecretd, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20260112.1) Feb 20 04:02:10 localhost podman[112389]: nova_virtsecretd Feb 20 04:02:10 localhost podman[112402]: 2026-02-20 09:02:10.994310764 +0000 UTC m=+0.051864664 container cleanup c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, release=1766032510, container_name=nova_virtsecretd, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64) Feb 20 04:02:11 localhost systemd[1]: libpod-conmon-c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2.scope: Deactivated successfully. 
Feb 20 04:02:11 localhost podman[112429]: error opening file `/run/crun/c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2/status`: No such file or directory Feb 20 04:02:11 localhost podman[112418]: 2026-02-20 09:02:11.08926535 +0000 UTC m=+0.063898047 container cleanup c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtsecretd, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 04:02:11 localhost podman[112418]: nova_virtsecretd Feb 20 04:02:11 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Feb 20 04:02:11 localhost systemd[1]: Stopped nova_virtsecretd container. 
Feb 20 04:02:11 localhost python3.9[112524]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:02:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51561 DF PROTO=TCP SPT=39554 DPT=9105 SEQ=2002411649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982BB680000000001030307) Feb 20 04:02:11 localhost systemd[1]: Reloading. Feb 20 04:02:11 localhost systemd-rc-local-generator[112551]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:02:11 localhost systemd-sysv-generator[112556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:02:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:02:12 localhost systemd[1]: var-lib-containers-storage-overlay-a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95-merged.mount: Deactivated successfully. Feb 20 04:02:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2-userdata-shm.mount: Deactivated successfully. Feb 20 04:02:12 localhost systemd[1]: Stopping nova_virtstoraged container... Feb 20 04:02:12 localhost systemd[1]: libpod-025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108.scope: Deactivated successfully. 
Feb 20 04:02:12 localhost podman[112565]: 2026-02-20 09:02:12.261401889 +0000 UTC m=+0.077580249 container died 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtstoraged, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=) Feb 20 04:02:12 localhost podman[112565]: 2026-02-20 09:02:12.302568642 +0000 UTC m=+0.118746992 container cleanup 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, architecture=x86_64, container_name=nova_virtstoraged, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 
nova-libvirt, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3) Feb 20 04:02:12 localhost podman[112565]: nova_virtstoraged Feb 20 04:02:12 localhost podman[112582]: 2026-02-20 09:02:12.342891349 +0000 UTC m=+0.070406908 container cleanup 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 
'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtstoraged) Feb 20 04:02:12 localhost systemd[1]: libpod-conmon-025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108.scope: Deactivated successfully. Feb 20 04:02:12 localhost podman[112609]: error opening file `/run/crun/025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108/status`: No such file or directory Feb 20 04:02:12 localhost podman[112598]: 2026-02-20 09:02:12.443333044 +0000 UTC m=+0.068759396 container cleanup 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, container_name=nova_virtstoraged, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 
'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container) Feb 20 04:02:12 localhost podman[112598]: nova_virtstoraged Feb 20 04:02:12 localhost systemd[1]: 
tripleo_nova_virtstoraged.service: Deactivated successfully. Feb 20 04:02:12 localhost systemd[1]: Stopped nova_virtstoraged container. Feb 20 04:02:13 localhost systemd[1]: tmp-crun.4g6Y7y.mount: Deactivated successfully. Feb 20 04:02:13 localhost systemd[1]: var-lib-containers-storage-overlay-bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5-merged.mount: Deactivated successfully. Feb 20 04:02:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108-userdata-shm.mount: Deactivated successfully. Feb 20 04:02:13 localhost python3.9[112702]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:02:13 localhost systemd[1]: Reloading. Feb 20 04:02:13 localhost systemd-rc-local-generator[112732]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:02:13 localhost systemd-sysv-generator[112735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:02:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:02:13 localhost systemd[1]: Stopping ovn_controller container... Feb 20 04:02:13 localhost systemd[1]: libpod-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.scope: Deactivated successfully. Feb 20 04:02:13 localhost systemd[1]: libpod-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.scope: Consumed 2.699s CPU time. 
Feb 20 04:02:13 localhost podman[112744]: 2026-02-20 09:02:13.641286552 +0000 UTC m=+0.080740407 container died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, container_name=ovn_controller, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13) Feb 20 04:02:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: Deactivated successfully. Feb 20 04:02:13 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850. Feb 20 04:02:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: No such file or directory Feb 20 04:02:13 localhost podman[112744]: 2026-02-20 09:02:13.680183985 +0000 UTC m=+0.119637820 container cleanup 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, container_name=ovn_controller, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible) Feb 20 04:02:13 localhost podman[112744]: ovn_controller Feb 20 04:02:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: No such file or directory Feb 20 04:02:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: No such file or directory Feb 20 04:02:13 localhost podman[112758]: 2026-02-20 09:02:13.698200222 +0000 UTC m=+0.050882234 container cleanup 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public) Feb 20 04:02:13 localhost systemd[1]: libpod-conmon-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.scope: Deactivated successfully. 
Feb 20 04:02:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: No such file or directory Feb 20 04:02:13 localhost systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: No such file or directory Feb 20 04:02:13 localhost podman[112773]: 2026-02-20 09:02:13.786233694 +0000 UTC m=+0.060419449 container cleanup 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, container_name=ovn_controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 20 04:02:13 localhost podman[112773]: ovn_controller Feb 20 04:02:13 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully. Feb 20 04:02:13 localhost systemd[1]: Stopped ovn_controller container. Feb 20 04:02:14 localhost systemd[1]: var-lib-containers-storage-overlay-9d2a27c37c1e0aa5be6fdab947882ef1f426e5cc1bd21c037426b7439e8b098c-merged.mount: Deactivated successfully. Feb 20 04:02:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850-userdata-shm.mount: Deactivated successfully. Feb 20 04:02:14 localhost python3.9[112877]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:02:14 localhost systemd[1]: Reloading. Feb 20 04:02:14 localhost systemd-rc-local-generator[112905]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:02:14 localhost systemd-sysv-generator[112910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:02:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:02:14 localhost systemd[1]: Stopping ovn_metadata_agent container... Feb 20 04:02:15 localhost systemd[1]: libpod-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.scope: Deactivated successfully. Feb 20 04:02:15 localhost systemd[1]: libpod-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.scope: Consumed 11.444s CPU time. Feb 20 04:02:15 localhost podman[112917]: 2026-02-20 09:02:15.26491023 +0000 UTC m=+0.369471813 container died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 20 04:02:15 localhost systemd[1]: tmp-crun.UqVjfM.mount: Deactivated successfully. Feb 20 04:02:15 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: Deactivated successfully. 
Feb 20 04:02:15 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59. Feb 20 04:02:15 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: No such file or directory Feb 20 04:02:15 localhost sshd[112936]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:02:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59-userdata-shm.mount: Deactivated successfully. Feb 20 04:02:15 localhost podman[112917]: 2026-02-20 09:02:15.339991112 +0000 UTC m=+0.444552645 container cleanup 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=ovn_metadata_agent, tcib_managed=true, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 04:02:15 localhost podman[112917]: ovn_metadata_agent Feb 20 04:02:15 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: Failed to open 
/run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: No such file or directory Feb 20 04:02:15 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: No such file or directory Feb 20 04:02:15 localhost podman[112929]: 2026-02-20 09:02:15.364140269 +0000 UTC m=+0.094545465 container cleanup 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 20 04:02:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13410 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=1882484095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982CB680000000001030307) Feb 20 04:02:16 localhost systemd[1]: var-lib-containers-storage-overlay-d98c062b21764b21e0b6595874844668fb8ff8886b054dd456077eeaff5c7e50-merged.mount: Deactivated successfully. 
Feb 20 04:02:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25599 DF PROTO=TCP SPT=59982 DPT=9882 SEQ=1976116075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982D26A0000000001030307) Feb 20 04:02:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25601 DF PROTO=TCP SPT=59982 DPT=9882 SEQ=1976116075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982DE680000000001030307) Feb 20 04:02:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25602 DF PROTO=TCP SPT=59982 DPT=9882 SEQ=1976116075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982EE280000000001030307) Feb 20 04:02:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15112 DF PROTO=TCP SPT=55516 DPT=9105 SEQ=3645250186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982F9280000000001030307) Feb 20 04:02:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15113 DF PROTO=TCP SPT=55516 DPT=9105 SEQ=3645250186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598301280000000001030307) Feb 20 04:02:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25603 DF PROTO=TCP SPT=59982 DPT=9882 SEQ=1976116075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A59830F680000000001030307) Feb 20 04:02:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19912 DF PROTO=TCP SPT=50874 DPT=9101 SEQ=1703013491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59831B150000000001030307) Feb 20 04:02:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19914 DF PROTO=TCP SPT=50874 DPT=9101 SEQ=1703013491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598327280000000001030307) Feb 20 04:02:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15115 DF PROTO=TCP SPT=55516 DPT=9105 SEQ=3645250186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598331680000000001030307) Feb 20 04:02:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28307 DF PROTO=TCP SPT=33502 DPT=9100 SEQ=1954362177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598341680000000001030307) Feb 20 04:02:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61440 DF PROTO=TCP SPT=34920 DPT=9882 SEQ=1211461637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983479B0000000001030307) Feb 20 04:02:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61442 DF PROTO=TCP SPT=34920 DPT=9882 SEQ=1211461637 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A598353A80000000001030307) Feb 20 04:02:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61443 DF PROTO=TCP SPT=34920 DPT=9882 SEQ=1211461637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598363690000000001030307) Feb 20 04:02:55 localhost sshd[112950]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:02:57 localhost sshd[113014]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:02:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14898 DF PROTO=TCP SPT=33676 DPT=9105 SEQ=2657743949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59836E290000000001030307) Feb 20 04:02:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14899 DF PROTO=TCP SPT=33676 DPT=9105 SEQ=2657743949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598376280000000001030307) Feb 20 04:03:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61444 DF PROTO=TCP SPT=34920 DPT=9882 SEQ=1211461637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598383680000000001030307) Feb 20 04:03:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61647 DF PROTO=TCP SPT=47776 DPT=9101 SEQ=1524675634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598390450000000001030307) Feb 20 04:03:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac 
MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61649 DF PROTO=TCP SPT=47776 DPT=9101 SEQ=1524675634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59839C680000000001030307) Feb 20 04:03:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14901 DF PROTO=TCP SPT=33676 DPT=9105 SEQ=2657743949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983A5680000000001030307) Feb 20 04:03:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32104 DF PROTO=TCP SPT=55654 DPT=9102 SEQ=654630034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983B5680000000001030307) Feb 20 04:03:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6701 DF PROTO=TCP SPT=39064 DPT=9882 SEQ=1798188793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983BCCA0000000001030307) Feb 20 04:03:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6703 DF PROTO=TCP SPT=39064 DPT=9882 SEQ=1798188793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983C8E80000000001030307) Feb 20 04:03:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6704 DF PROTO=TCP SPT=39064 DPT=9882 SEQ=1798188793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983D8A80000000001030307) Feb 20 04:03:27 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57140 DF PROTO=TCP SPT=34470 DPT=9105 SEQ=2681173294 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983E3680000000001030307) Feb 20 04:03:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57141 DF PROTO=TCP SPT=34470 DPT=9105 SEQ=2681173294 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983EB680000000001030307) Feb 20 04:03:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24433 DF PROTO=TCP SPT=41248 DPT=9100 SEQ=437128031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983F7690000000001030307) Feb 20 04:03:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27182 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3234130462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598405750000000001030307) Feb 20 04:03:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27184 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3234130462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598411680000000001030307) Feb 20 04:03:39 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing. Feb 20 04:03:39 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 72703 (conmon) with signal SIGKILL. 
Feb 20 04:03:39 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL Feb 20 04:03:39 localhost systemd[1]: libpod-conmon-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.scope: Deactivated successfully. Feb 20 04:03:39 localhost podman[113044]: error opening file `/run/crun/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59/status`: No such file or directory Feb 20 04:03:39 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: No such file or directory Feb 20 04:03:39 localhost systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: No such file or directory Feb 20 04:03:39 localhost podman[113031]: 2026-02-20 09:03:39.658677334 +0000 UTC m=+0.087925999 container cleanup 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 20 04:03:39 localhost podman[113031]: ovn_metadata_agent Feb 20 04:03:39 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'. Feb 20 04:03:39 localhost systemd[1]: Stopped ovn_metadata_agent container. Feb 20 04:03:40 localhost python3.9[113139]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:03:40 localhost systemd[1]: Reloading. Feb 20 04:03:40 localhost systemd-rc-local-generator[113169]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:03:40 localhost systemd-sysv-generator[113172]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:03:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:03:41 localhost sshd[113192]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:03:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57143 DF PROTO=TCP SPT=34470 DPT=9105 SEQ=2681173294 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59841B680000000001030307) Feb 20 04:03:42 localhost python3.9[113271]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:43 localhost python3.9[113363]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:43 localhost python3.9[113455]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:44 localhost python3.9[113547]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service 
state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:44 localhost python3.9[113639]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:45 localhost python3.9[113731]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52426 DF PROTO=TCP SPT=42758 DPT=9102 SEQ=1735802678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59842B680000000001030307) Feb 20 04:03:46 localhost python3.9[113823]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:46 localhost python3.9[113915]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:47 localhost python3.9[114007]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15244 DF PROTO=TCP SPT=48172 DPT=9882 SEQ=1798228547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598431FA0000000001030307) Feb 20 04:03:47 localhost python3.9[114099]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:48 localhost python3.9[114191]: ansible-ansible.builtin.file Invoked with 
path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:48 localhost python3.9[114283]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:49 localhost python3.9[114375]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:50 localhost python3.9[114467]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:50 localhost python3.9[114559]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15246 DF PROTO=TCP SPT=48172 DPT=9882 SEQ=1798228547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59843DE90000000001030307) Feb 20 04:03:51 localhost python3.9[114651]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:51 localhost python3.9[114743]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:52 localhost python3.9[114835]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:53 localhost python3.9[114927]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:53 localhost python3.9[115019]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:54 localhost python3.9[115111]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:54 localhost auditd[725]: Audit daemon rotating log files Feb 20 04:03:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15247 DF PROTO=TCP SPT=48172 DPT=9882 SEQ=1798228547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59844DA90000000001030307) Feb 20 04:03:54 localhost python3.9[115203]: 
ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:55 localhost python3.9[115295]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:56 localhost python3.9[115387]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:56 localhost python3.9[115479]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:57 localhost python3.9[115571]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service 
state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33357 DF PROTO=TCP SPT=41010 DPT=9105 SEQ=1801412553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598458A90000000001030307) Feb 20 04:03:57 localhost python3.9[115663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:58 localhost python3.9[115785]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:58 localhost systemd[1]: tmp-crun.DPTcTv.mount: Deactivated successfully. 
Feb 20 04:03:58 localhost podman[115906]: 2026-02-20 09:03:58.764173752 +0000 UTC m=+0.103862472 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, release=1770267347, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph) Feb 20 04:03:58 localhost podman[115906]: 2026-02-20 09:03:58.865045801 +0000 UTC m=+0.204734511 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, RELEASE=main, distribution-scope=public, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, release=1770267347, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat 
Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:03:59 localhost python3.9[115970]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:03:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33358 DF PROTO=TCP SPT=41010 DPT=9105 SEQ=1801412553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598460A90000000001030307) Feb 20 04:03:59 localhost python3.9[116142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:00 localhost python3.9[116266]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:00 localhost python3.9[116373]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:01 localhost python3.9[116465]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:01 localhost python3.9[116557]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:02 localhost python3.9[116649]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15248 DF PROTO=TCP SPT=48172 DPT=9882 SEQ=1798228547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59846D680000000001030307) Feb 20 04:04:03 localhost python3.9[116741]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:03 localhost python3.9[116833]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:04 localhost python3.9[116925]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:04 localhost python3.9[117017]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:05 localhost python3.9[117109]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:06 localhost python3.9[117201]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=13768 DF PROTO=TCP SPT=51760 DPT=9101 SEQ=2817866390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59847AA50000000001030307) Feb 20 04:04:06 localhost python3.9[117293]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:07 localhost python3.9[117385]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:08 localhost python3.9[117477]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 20 04:04:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13770 DF PROTO=TCP SPT=51760 DPT=9101 SEQ=2817866390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598486A90000000001030307) Feb 20 04:04:10 localhost python3.9[117569]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None 
enabled=None force=None masked=None Feb 20 04:04:10 localhost systemd[1]: Reloading. Feb 20 04:04:10 localhost systemd-rc-local-generator[117596]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:04:10 localhost systemd-sysv-generator[117599]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:04:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:04:11 localhost python3.9[117696]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33360 DF PROTO=TCP SPT=41010 DPT=9105 SEQ=1801412553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598491680000000001030307) Feb 20 04:04:12 localhost python3.9[117789]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:12 localhost python3.9[117882]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None 
executable=None creates=None removes=None stdin=None Feb 20 04:04:13 localhost python3.9[117975]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:14 localhost python3.9[118068]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:14 localhost python3.9[118161]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:15 localhost python3.9[118254]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:16 localhost python3.9[118347]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16550 DF PROTO=TCP SPT=45740 DPT=9100 SEQ=2847017574 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A5984A1680000000001030307) Feb 20 04:04:16 localhost python3.9[118440]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:17 localhost python3.9[118533]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49483 DF PROTO=TCP SPT=46936 DPT=9882 SEQ=3980037504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984A72A0000000001030307) Feb 20 04:04:17 localhost python3.9[118626]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:18 localhost python3.9[118719]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:19 localhost python3.9[118812]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:19 localhost python3.9[118905]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:20 localhost python3.9[118998]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49485 DF PROTO=TCP SPT=46936 DPT=9882 SEQ=3980037504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984B3290000000001030307) Feb 20 04:04:21 localhost python3.9[119091]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:21 localhost python3.9[119184]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:22 localhost python3.9[119277]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False 
expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:23 localhost python3.9[119370]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:24 localhost python3.9[119463]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49486 DF PROTO=TCP SPT=46936 DPT=9882 SEQ=3980037504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984C2E80000000001030307) Feb 20 04:04:25 localhost python3.9[119556]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:26 localhost systemd[1]: session-38.scope: Deactivated successfully. Feb 20 04:04:26 localhost systemd[1]: session-38.scope: Consumed 48.546s CPU time. Feb 20 04:04:26 localhost systemd-logind[759]: Session 38 logged out. Waiting for processes to exit. Feb 20 04:04:26 localhost systemd-logind[759]: Removed session 38. 
Feb 20 04:04:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61815 DF PROTO=TCP SPT=53196 DPT=9105 SEQ=4136402762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984CDE80000000001030307) Feb 20 04:04:28 localhost sshd[119572]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:04:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61816 DF PROTO=TCP SPT=53196 DPT=9105 SEQ=4136402762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984D5E90000000001030307) Feb 20 04:04:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24810 DF PROTO=TCP SPT=44300 DPT=9102 SEQ=980417637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984E1680000000001030307) Feb 20 04:04:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59185 DF PROTO=TCP SPT=37732 DPT=9101 SEQ=15335578 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984EFD50000000001030307) Feb 20 04:04:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59187 DF PROTO=TCP SPT=37732 DPT=9101 SEQ=15335578 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984FBE90000000001030307) Feb 20 04:04:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61818 DF PROTO=TCP SPT=53196 DPT=9105 
SEQ=4136402762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598505680000000001030307) Feb 20 04:04:44 localhost sshd[119574]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:04:44 localhost systemd-logind[759]: New session 39 of user zuul. Feb 20 04:04:44 localhost systemd[1]: Started Session 39 of User zuul. Feb 20 04:04:45 localhost python3.9[119667]: ansible-ansible.legacy.ping Invoked with data=pong Feb 20 04:04:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49466 DF PROTO=TCP SPT=35674 DPT=9100 SEQ=1302452893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598515680000000001030307) Feb 20 04:04:46 localhost python3.9[119771]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:04:47 localhost python3.9[119863]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27068 DF PROTO=TCP SPT=55792 DPT=9882 SEQ=3018468552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59851C5A0000000001030307) Feb 20 04:04:48 localhost python3.9[119956]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:04:49 localhost python3.9[120048]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d 
state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:49 localhost python3.9[120140]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:04:50 localhost python3.9[120213]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578289.2424505-175-50825165301189/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27070 DF PROTO=TCP SPT=55792 DPT=9882 SEQ=3018468552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598528680000000001030307) Feb 20 04:04:51 localhost python3.9[120305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:04:52 localhost python3.9[120401]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:04:53 localhost python3.9[120493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:04:53 localhost python3.9[120583]: ansible-ansible.builtin.service_facts Invoked Feb 20 04:04:53 localhost network[120600]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 04:04:53 localhost network[120601]: 'network-scripts' will be removed from distribution in near future. Feb 20 04:04:53 localhost network[120602]: It is advised to switch to 'NetworkManager' instead for network management. Feb 20 04:04:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27071 DF PROTO=TCP SPT=55792 DPT=9882 SEQ=3018468552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598538280000000001030307) Feb 20 04:04:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:04:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42826 DF PROTO=TCP SPT=49320 DPT=9105 SEQ=1002855727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59853F070000000001030307) Feb 20 04:04:57 localhost python3.9[120799]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:04:58 localhost python3.9[120889]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:04:58 localhost sshd[120894]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:04:59 localhost python3.9[120987]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream#012set -euxo pipefail#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main#012# This is required for FIPS enabled until trunk.rdoproject.org#012# is not being served from a centos7 host, tracked by#012# https://issues.redhat.com/browse/RHOSZUUL-1517#012dnf -y install crypto-policies#012update-crypto-policies --set FIPS:NO-ENFORCE-EMS#012./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream#012#012# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible#012# with rhel 9.2 openssh#012dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save#012dnf -y upgrade openstack-selinux#012rm -f 
/run/virtlogd.pid#012#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:04:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42828 DF PROTO=TCP SPT=49320 DPT=9105 SEQ=1002855727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59854B280000000001030307) Feb 20 04:05:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27072 DF PROTO=TCP SPT=55792 DPT=9882 SEQ=3018468552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598559680000000001030307) Feb 20 04:05:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62550 DF PROTO=TCP SPT=43976 DPT=9101 SEQ=160469416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598565050000000001030307) Feb 20 04:05:09 localhost systemd[1]: Stopping OpenSSH server daemon... Feb 20 04:05:09 localhost systemd[1]: sshd.service: Deactivated successfully. Feb 20 04:05:09 localhost systemd[1]: Stopped OpenSSH server daemon. Feb 20 04:05:09 localhost systemd[1]: sshd.service: Consumed 27.099s CPU time. Feb 20 04:05:09 localhost systemd[1]: Stopped target sshd-keygen.target. Feb 20 04:05:09 localhost systemd[1]: Stopping sshd-keygen.target... Feb 20 04:05:09 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). 
Feb 20 04:05:09 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 20 04:05:09 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 20 04:05:09 localhost systemd[1]: Reached target sshd-keygen.target. Feb 20 04:05:09 localhost systemd[1]: Starting OpenSSH server daemon... Feb 20 04:05:09 localhost sshd[121107]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:05:09 localhost systemd[1]: Started OpenSSH server daemon. Feb 20 04:05:09 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 04:05:09 localhost systemd[1]: Starting man-db-cache-update.service... Feb 20 04:05:09 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 04:05:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62552 DF PROTO=TCP SPT=43976 DPT=9101 SEQ=160469416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598571280000000001030307) Feb 20 04:05:09 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 20 04:05:09 localhost systemd[1]: Finished man-db-cache-update.service. Feb 20 04:05:09 localhost systemd[1]: run-rbbb8798f96e34674bbaf104827d20089.service: Deactivated successfully. Feb 20 04:05:09 localhost systemd[1]: run-r5986b363c93241878ab93cc953588cd2.service: Deactivated successfully. Feb 20 04:05:10 localhost systemd[1]: Stopping OpenSSH server daemon... Feb 20 04:05:10 localhost systemd[1]: sshd.service: Deactivated successfully. Feb 20 04:05:10 localhost systemd[1]: Stopped OpenSSH server daemon. 
Feb 20 04:05:10 localhost systemd[1]: Stopped target sshd-keygen.target. Feb 20 04:05:10 localhost systemd[1]: Stopping sshd-keygen.target... Feb 20 04:05:10 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 20 04:05:10 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 20 04:05:10 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 20 04:05:10 localhost systemd[1]: Reached target sshd-keygen.target. Feb 20 04:05:10 localhost systemd[1]: Starting OpenSSH server daemon... Feb 20 04:05:10 localhost sshd[121278]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:05:10 localhost systemd[1]: Started OpenSSH server daemon. 
Feb 20 04:05:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42830 DF PROTO=TCP SPT=49320 DPT=9105 SEQ=1002855727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59857B680000000001030307) Feb 20 04:05:14 localhost sshd[121284]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:05:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55633 DF PROTO=TCP SPT=42926 DPT=9100 SEQ=775551466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59858B680000000001030307) Feb 20 04:05:16 localhost sshd[121287]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:05:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1739 DF PROTO=TCP SPT=58590 DPT=9882 SEQ=3775899828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985918A0000000001030307) Feb 20 04:05:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1741 DF PROTO=TCP SPT=58590 DPT=9882 SEQ=3775899828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59859DA80000000001030307) Feb 20 04:05:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1742 DF PROTO=TCP SPT=58590 DPT=9882 SEQ=3775899828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985AD680000000001030307) Feb 20 04:05:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48393 DF PROTO=TCP SPT=53078 DPT=9105 SEQ=2937509130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985B8280000000001030307) Feb 20 04:05:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48394 DF PROTO=TCP SPT=53078 DPT=9105 SEQ=2937509130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985C0290000000001030307) Feb 20 04:05:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1743 DF PROTO=TCP SPT=58590 DPT=9882 SEQ=3775899828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985CD680000000001030307) Feb 20 04:05:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56670 DF PROTO=TCP SPT=44272 DPT=9101 SEQ=1716440593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985DA340000000001030307) Feb 20 04:05:36 localhost sshd[121408]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:05:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56672 DF PROTO=TCP SPT=44272 DPT=9101 SEQ=1716440593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985E6280000000001030307) Feb 20 04:05:41 localhost sshd[121423]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:05:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48396 DF PROTO=TCP SPT=53078 DPT=9105 SEQ=2937509130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A5985EF680000000001030307) Feb 20 04:05:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10518 DF PROTO=TCP SPT=57060 DPT=9102 SEQ=1822393694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985FF680000000001030307) Feb 20 04:05:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25107 DF PROTO=TCP SPT=42584 DPT=9882 SEQ=4293515916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598606BA0000000001030307) Feb 20 04:05:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25109 DF PROTO=TCP SPT=42584 DPT=9882 SEQ=4293515916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598612A90000000001030307) Feb 20 04:05:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25110 DF PROTO=TCP SPT=42584 DPT=9882 SEQ=4293515916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598622680000000001030307) Feb 20 04:05:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20179 DF PROTO=TCP SPT=34736 DPT=9105 SEQ=1894680188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59862D690000000001030307) Feb 20 04:05:59 localhost sshd[121597]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:05:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=20180 DF PROTO=TCP SPT=34736 DPT=9105 SEQ=1894680188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598635690000000001030307) Feb 20 04:06:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20390 DF PROTO=TCP SPT=57532 DPT=9100 SEQ=1464944850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598641690000000001030307) Feb 20 04:06:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47571 DF PROTO=TCP SPT=58468 DPT=9101 SEQ=1646320255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59864F640000000001030307) Feb 20 04:06:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47573 DF PROTO=TCP SPT=58468 DPT=9101 SEQ=1646320255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59865B680000000001030307) Feb 20 04:06:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20182 DF PROTO=TCP SPT=34736 DPT=9105 SEQ=1894680188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598665680000000001030307) Feb 20 04:06:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64594 DF PROTO=TCP SPT=48798 DPT=9100 SEQ=1833037813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598675680000000001030307) Feb 20 04:06:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=774 DF PROTO=TCP SPT=33430 DPT=9882 SEQ=4245260675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59867BEA0000000001030307) Feb 20 04:06:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=776 DF PROTO=TCP SPT=33430 DPT=9882 SEQ=4245260675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598687E90000000001030307) Feb 20 04:06:21 localhost kernel: SELinux: Converting 2754 SID table entries... Feb 20 04:06:21 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 20 04:06:21 localhost kernel: SELinux: policy capability open_perms=1 Feb 20 04:06:21 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 20 04:06:21 localhost kernel: SELinux: policy capability always_check_network=0 Feb 20 04:06:21 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 20 04:06:21 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 20 04:06:21 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 20 04:06:23 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=17 res=1 Feb 20 04:06:23 localhost python3.9[121935]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:06:24 localhost python3.9[122027]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:06:24 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=777 DF PROTO=TCP SPT=33430 DPT=9882 SEQ=4245260675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598697A80000000001030307) Feb 20 04:06:24 localhost python3.9[122100]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578383.9552515-423-137880703546724/.source.fact _original_basename=.ss4t8t6i follow=False checksum=d686dccd4d8cd0883f3e3bc0a6f664c73290ba68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:06:25 localhost python3.9[122190]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:06:26 localhost python3.9[122288]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 20 04:06:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57769 DF PROTO=TCP SPT=46492 DPT=9105 SEQ=999787407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986A2A80000000001030307) Feb 20 04:06:27 localhost python3.9[122342]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False 
disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:06:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57770 DF PROTO=TCP SPT=46492 DPT=9105 SEQ=999787407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986AAA80000000001030307) Feb 20 04:06:31 localhost systemd[1]: Reloading. Feb 20 04:06:31 localhost systemd-rc-local-generator[122379]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:06:31 localhost systemd-sysv-generator[122382]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:06:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:06:31 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 20 04:06:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=778 DF PROTO=TCP SPT=33430 DPT=9882 SEQ=4245260675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986B7680000000001030307) Feb 20 04:06:33 localhost python3.9[122483]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:06:35 localhost python3.9[122722]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False Feb 20 04:06:36 localhost python3.9[122814]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None Feb 20 04:06:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21677 DF PROTO=TCP SPT=38876 DPT=9101 SEQ=3683062109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986C4950000000001030307) Feb 20 04:06:37 localhost python3.9[122907]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:06:38 localhost python3.9[122999]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None Feb 20 04:06:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21679 DF PROTO=TCP SPT=38876 DPT=9101 SEQ=3683062109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986D0A80000000001030307) Feb 20 04:06:39 localhost python3.9[123091]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:06:40 localhost python3.9[123183]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:06:40 localhost python3.9[123256]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578399.774818-748-52794336280186/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None 
attributes=None Feb 20 04:06:41 localhost python3.9[123348]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:06:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57772 DF PROTO=TCP SPT=46492 DPT=9105 SEQ=999787407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986DB680000000001030307) Feb 20 04:06:43 localhost python3.9[123442]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None Feb 20 04:06:44 localhost sshd[123536]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:06:44 localhost python3.9[123535]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None Feb 20 04:06:45 localhost python3.9[123630]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Feb 20 04:06:45 localhost python3.9[123728]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None Feb 20 04:06:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51570 DF PROTO=TCP SPT=38710 DPT=9100 SEQ=4250352083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986EB680000000001030307) Feb 20 04:06:46 
localhost python3.9[123820]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:06:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34187 DF PROTO=TCP SPT=48446 DPT=9882 SEQ=991171327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986F11A0000000001030307) Feb 20 04:06:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34189 DF PROTO=TCP SPT=48446 DPT=9882 SEQ=991171327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986FD280000000001030307) Feb 20 04:06:51 localhost python3.9[123914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:06:52 localhost python3.9[124006]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:06:52 localhost 
python3.9[124079]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578411.7413259-1021-277731476419526/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 20 04:06:53 localhost python3.9[124171]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:06:53 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 20 04:06:53 localhost systemd[1]: Stopped Load Kernel Modules. Feb 20 04:06:53 localhost systemd[1]: Stopping Load Kernel Modules... Feb 20 04:06:53 localhost systemd[1]: Starting Load Kernel Modules... Feb 20 04:06:53 localhost systemd-modules-load[124175]: Module 'msr' is built in Feb 20 04:06:53 localhost systemd[1]: Finished Load Kernel Modules. 
Feb 20 04:06:54 localhost python3.9[124267]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:06:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34190 DF PROTO=TCP SPT=48446 DPT=9882 SEQ=991171327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59870CE80000000001030307)
Feb 20 04:06:55 localhost python3.9[124340]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578414.026495-1090-122631989914202/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:06:56 localhost python3.9[124432]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:06:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15041 DF PROTO=TCP SPT=45392 DPT=9105 SEQ=1214227235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598717A90000000001030307)
Feb 20 04:06:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15042 DF PROTO=TCP SPT=45392 DPT=9105 SEQ=1214227235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59871FA80000000001030307)
Feb 20 04:07:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30894 DF PROTO=TCP SPT=52968 DPT=9102 SEQ=84378607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59872B680000000001030307)
Feb 20 04:07:04 localhost python3.9[124555]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:07:05 localhost python3.9[124679]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 20 04:07:05 localhost python3.9[124784]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:07:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27950 DF PROTO=TCP SPT=40632 DPT=9101 SEQ=516693089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598739C50000000001030307)
Feb 20 04:07:06 localhost python3.9[124876]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:07:07 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 20 04:07:07 localhost systemd[1]: tuned.service: Deactivated successfully.
Feb 20 04:07:07 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 20 04:07:07 localhost systemd[1]: tuned.service: Consumed 1.845s CPU time, no IO.
Feb 20 04:07:07 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 20 04:07:09 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Feb 20 04:07:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27952 DF PROTO=TCP SPT=40632 DPT=9101 SEQ=516693089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598745E90000000001030307)
Feb 20 04:07:09 localhost python3.9[124979]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 20 04:07:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15044 DF PROTO=TCP SPT=45392 DPT=9105 SEQ=1214227235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59874F680000000001030307)
Feb 20 04:07:13 localhost python3.9[125071]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:07:13 localhost systemd[1]: Reloading.
Feb 20 04:07:13 localhost systemd-rc-local-generator[125100]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:07:13 localhost systemd-sysv-generator[125104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:07:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:07:14 localhost python3.9[125201]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:07:15 localhost systemd[1]: Reloading.
Feb 20 04:07:15 localhost systemd-rc-local-generator[125227]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:07:15 localhost systemd-sysv-generator[125233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:07:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:07:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52830 DF PROTO=TCP SPT=57070 DPT=9100 SEQ=209070533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59875F680000000001030307)
Feb 20 04:07:16 localhost python3.9[125331]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:07:17 localhost python3.9[125424]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:07:17 localhost kernel: Adding 1048572k swap on /swap. Priority:-2 extents:1 across:1048572k FS
Feb 20 04:07:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29473 DF PROTO=TCP SPT=47130 DPT=9882 SEQ=1828196097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987664A0000000001030307)
Feb 20 04:07:17 localhost python3.9[125517]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:07:19 localhost python3.9[125616]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:07:20 localhost python3.9[125709]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 04:07:20 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 20 04:07:20 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 20 04:07:20 localhost systemd[1]: Stopping Apply Kernel Variables...
Feb 20 04:07:20 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 20 04:07:20 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 20 04:07:20 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 20 04:07:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29475 DF PROTO=TCP SPT=47130 DPT=9882 SEQ=1828196097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598772680000000001030307)
Feb 20 04:07:21 localhost systemd[1]: session-39.scope: Deactivated successfully.
Feb 20 04:07:21 localhost systemd[1]: session-39.scope: Consumed 1min 59.695s CPU time.
Feb 20 04:07:21 localhost systemd-logind[759]: Session 39 logged out. Waiting for processes to exit.
Feb 20 04:07:21 localhost systemd-logind[759]: Removed session 39.
Feb 20 04:07:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29476 DF PROTO=TCP SPT=47130 DPT=9882 SEQ=1828196097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598782280000000001030307)
Feb 20 04:07:26 localhost sshd[125729]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:07:26 localhost systemd-logind[759]: New session 40 of user zuul.
Feb 20 04:07:26 localhost systemd[1]: Started Session 40 of User zuul.
Feb 20 04:07:26 localhost sshd[125765]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:07:27 localhost python3.9[125824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:07:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48171 DF PROTO=TCP SPT=60372 DPT=9105 SEQ=1810654222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59878CE80000000001030307)
Feb 20 04:07:28 localhost python3.9[125918]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:07:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48172 DF PROTO=TCP SPT=60372 DPT=9105 SEQ=1810654222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598794E80000000001030307)
Feb 20 04:07:30 localhost python3.9[126014]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:07:31 localhost python3.9[126105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:07:32 localhost python3.9[126201]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:07:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29477 DF PROTO=TCP SPT=47130 DPT=9882 SEQ=1828196097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987A3690000000001030307)
Feb 20 04:07:33 localhost python3.9[126255]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:07:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24755 DF PROTO=TCP SPT=50880 DPT=9101 SEQ=807687321 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987AEF50000000001030307)
Feb 20 04:07:37 localhost python3.9[126349]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:07:39 localhost python3.9[126504]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:07:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24757 DF PROTO=TCP SPT=50880 DPT=9101 SEQ=807687321 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987BAE80000000001030307)
Feb 20 04:07:39 localhost python3.9[126596]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:07:40 localhost python3.9[126700]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:07:41 localhost python3.9[126748]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:07:41 localhost python3.9[126840]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:07:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48174 DF PROTO=TCP SPT=60372 DPT=9105 SEQ=1810654222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987C5680000000001030307)
Feb 20 04:07:42 localhost python3.9[126913]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578461.4064102-320-28069435415316/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:07:43 localhost python3.9[127005]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:07:43 localhost systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 20 04:07:43 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 04:07:43 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 04:07:43 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 04:07:43 localhost python3.9[127098]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:07:44 localhost python3.9[127190]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:07:45 localhost python3.9[127282]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:07:46 localhost python3.9[127372]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:07:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21097 DF PROTO=TCP SPT=34696 DPT=9102 SEQ=909875513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987D5680000000001030307)
Feb 20 04:07:46 localhost python3.9[127466]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:07:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27106 DF PROTO=TCP SPT=33858 DPT=9882 SEQ=1491845009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987DB7A0000000001030307)
Feb 20 04:07:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27108 DF PROTO=TCP SPT=33858 DPT=9882 SEQ=1491845009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987E7680000000001030307)
Feb 20 04:07:50 localhost python3.9[127560]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:07:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27109 DF PROTO=TCP SPT=33858 DPT=9882 SEQ=1491845009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987F7290000000001030307)
Feb 20 04:07:55 localhost python3.9[127654]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:07:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25660 DF PROTO=TCP SPT=57916 DPT=9105 SEQ=3359371827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598802280000000001030307)
Feb 20 04:07:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25661 DF PROTO=TCP SPT=57916 DPT=9105 SEQ=3359371827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59880A280000000001030307)
Feb 20 04:07:59 localhost python3.9[127754]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:08:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27110 DF PROTO=TCP SPT=33858 DPT=9882 SEQ=1491845009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598817680000000001030307)
Feb 20 04:08:04 localhost python3.9[127848]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:08:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57223 DF PROTO=TCP SPT=39340 DPT=9101 SEQ=3203763100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598824250000000001030307)
Feb 20 04:08:08 localhost python3.9[128004]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:08:09 localhost sshd[128007]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:08:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57225 DF PROTO=TCP SPT=39340 DPT=9101 SEQ=3203763100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598830280000000001030307)
Feb 20 04:08:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25663 DF PROTO=TCP SPT=57916 DPT=9105 SEQ=3359371827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598839680000000001030307)
Feb 20 04:08:12 localhost python3.9[128115]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:08:13 localhost sshd[128117]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:08:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10558 DF PROTO=TCP SPT=33098 DPT=9102 SEQ=2659812165 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598849680000000001030307)
Feb 20 04:08:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5348 DF PROTO=TCP SPT=33244 DPT=9882 SEQ=2025381304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598850AA0000000001030307)
Feb 20 04:08:20 localhost sshd[128130]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:08:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5350 DF PROTO=TCP SPT=33244 DPT=9882 SEQ=2025381304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59885CA80000000001030307)
Feb 20 04:08:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5351 DF PROTO=TCP SPT=33244 DPT=9882 SEQ=2025381304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59886C680000000001030307)
Feb 20 04:08:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34420 DF PROTO=TCP SPT=58272 DPT=9105 SEQ=1369561492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598877690000000001030307)
Feb 20 04:08:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34421 DF PROTO=TCP SPT=58272 DPT=9105 SEQ=1369561492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59887F680000000001030307)
Feb 20 04:08:32 localhost python3.9[128287]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:08:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14193 DF PROTO=TCP SPT=33326 DPT=9100 SEQ=2745313112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59888B690000000001030307)
Feb 20 04:08:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33125 DF PROTO=TCP SPT=60058 DPT=9101 SEQ=1362994008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598899550000000001030307)
Feb 20 04:08:36 localhost python3.9[128382]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 04:08:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33127 DF PROTO=TCP SPT=60058 DPT=9101 SEQ=1362994008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988A5680000000001030307)
Feb 20 04:08:40 localhost python3.9[128479]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:08:41 localhost python3.9[128584]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:08:41 localhost python3.9[128657]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771578520.8020444-773-209109761705897/.source.json _original_basename=.cbwcwmdf follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:08:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34423 DF PROTO=TCP SPT=58272 DPT=9105 SEQ=1369561492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988AF680000000001030307)
Feb 20 04:08:43 localhost python3.9[128749]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 04:08:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45207 DF PROTO=TCP SPT=54542 DPT=9102 SEQ=247792192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988BF680000000001030307)
Feb 20 04:08:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41054 DF PROTO=TCP SPT=43172 DPT=9882 SEQ=2963501716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988C5DA0000000001030307)
Feb 20 04:08:49 localhost podman[128762]: 2026-02-20 09:08:43.466791935 +0000 UTC m=+0.048438438 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 20 04:08:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41056 DF PROTO=TCP SPT=43172 DPT=9882 SEQ=2963501716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988D1E80000000001030307)
Feb 20 04:08:50 localhost python3.9[128963]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 04:08:51 localhost sshd[128988]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:08:53 localhost sshd[128999]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:08:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41057 DF PROTO=TCP SPT=43172 DPT=9882 SEQ=2963501716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988E1A80000000001030307)
Feb 20 04:08:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39695 DF PROTO=TCP SPT=43888 DPT=9105 SEQ=1885777334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988ECA80000000001030307)
Feb 20 04:08:58 localhost podman[128975]: 2026-02-20 09:08:51.094113643 +0000 UTC m=+0.044720713 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 04:08:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39696 DF PROTO=TCP SPT=43888 DPT=9105 SEQ=1885777334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988F4A80000000001030307) Feb 20 04:09:00 localhost python3.9[129180]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 20 04:09:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41058 DF PROTO=TCP SPT=43172 DPT=9882 SEQ=2963501716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598901690000000001030307) Feb 20 04:09:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=892 DF PROTO=TCP SPT=38760 DPT=9101 SEQ=1623084561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59890E840000000001030307) Feb 20 04:09:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=894 DF PROTO=TCP SPT=38760 DPT=9101 SEQ=1623084561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A59891AA80000000001030307) Feb 20 04:09:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39698 DF PROTO=TCP SPT=43888 DPT=9105 SEQ=1885777334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598925680000000001030307) Feb 20 04:09:12 localhost podman[129193]: 2026-02-20 09:09:00.185136516 +0000 UTC m=+0.045695875 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 20 04:09:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38454 DF PROTO=TCP SPT=55242 DPT=9102 SEQ=2644966875 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598935680000000001030307) Feb 20 04:09:17 localhost python3.9[130006]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 20 04:09:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8363 DF PROTO=TCP SPT=56128 DPT=9882 SEQ=1571366773 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59893B0B0000000001030307) Feb 20 04:09:19 localhost podman[130019]: 2026-02-20 09:09:17.525113536 +0000 UTC m=+0.031527596 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Feb 20 04:09:20 localhost python3.9[130199]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 20 04:09:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8365 DF PROTO=TCP SPT=56128 DPT=9882 SEQ=1571366773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598947280000000001030307) Feb 20 04:09:21 localhost podman[130213]: 2026-02-20 09:09:20.343092889 +0000 UTC m=+0.048204071 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:09:22 localhost python3.9[130376]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 
'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 20 04:09:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8366 DF PROTO=TCP SPT=56128 DPT=9882 SEQ=1571366773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598956E90000000001030307) Feb 20 04:09:26 localhost podman[130388]: 2026-02-20 09:09:22.860440432 +0000 UTC m=+0.045870869 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Feb 20 04:09:27 localhost python3.9[130565]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 20 04:09:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40299 DF PROTO=TCP SPT=59724 DPT=9105 SEQ=2635809123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598961A80000000001030307) Feb 20 04:09:29 localhost podman[130577]: 2026-02-20 09:09:27.521759129 +0000 UTC m=+0.046061545 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Feb 20 04:09:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40300 DF PROTO=TCP SPT=59724 DPT=9105 SEQ=2635809123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598969A90000000001030307) Feb 20 04:09:32 localhost systemd[1]: session-40.scope: Deactivated successfully. Feb 20 04:09:32 localhost systemd[1]: session-40.scope: Consumed 2min 7.898s CPU time. Feb 20 04:09:32 localhost systemd-logind[759]: Session 40 logged out. Waiting for processes to exit. Feb 20 04:09:32 localhost systemd-logind[759]: Removed session 40. 
Feb 20 04:09:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45354 DF PROTO=TCP SPT=36634 DPT=9100 SEQ=2573324998 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598975680000000001030307) Feb 20 04:09:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34568 DF PROTO=TCP SPT=42508 DPT=9101 SEQ=3887936909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598983B70000000001030307) Feb 20 04:09:37 localhost sshd[130686]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:09:37 localhost sshd[130688]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:09:37 localhost systemd-logind[759]: New session 41 of user zuul. Feb 20 04:09:37 localhost systemd[1]: Started Session 41 of User zuul. Feb 20 04:09:38 localhost python3.9[130781]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:09:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34570 DF PROTO=TCP SPT=42508 DPT=9101 SEQ=3887936909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59898FA80000000001030307) Feb 20 04:09:39 localhost python3.9[130877]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None Feb 20 04:09:41 localhost python3.9[130970]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 20 04:09:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40302 DF PROTO=TCP SPT=59724 DPT=9105 SEQ=2635809123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598999680000000001030307) Feb 20 04:09:42 localhost python3.9[131024]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 20 04:09:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63254 DF PROTO=TCP SPT=33380 DPT=9100 SEQ=2575278553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989A9680000000001030307) Feb 20 04:09:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:09:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:09:47 localhost python3.9[131118]: ansible-ansible.legacy.dnf Invoked with 
name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:09:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13417 DF PROTO=TCP SPT=33660 DPT=9882 SEQ=2916413918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989B03B0000000001030307) Feb 20 04:09:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13419 DF PROTO=TCP SPT=33660 DPT=9882 SEQ=2916413918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989BC280000000001030307) Feb 20 04:09:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:09:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:09:51 localhost 
python3.9[131212]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 20 04:09:52 localhost python3.9[131305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:09:53 localhost python3.9[131397]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None Feb 20 04:09:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13420 DF PROTO=TCP SPT=33660 DPT=9882 SEQ=2916413918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989CBE80000000001030307) Feb 20 04:09:55 localhost kernel: SELinux: Converting 2756 SID table entries... 
Feb 20 04:09:55 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 20 04:09:55 localhost kernel: SELinux: policy capability open_perms=1 Feb 20 04:09:55 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 20 04:09:55 localhost kernel: SELinux: policy capability always_check_network=0 Feb 20 04:09:55 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 20 04:09:55 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 20 04:09:55 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 20 04:09:56 localhost python3.9[131809]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:09:57 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=18 res=1 Feb 20 04:09:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23474 DF PROTO=TCP SPT=33482 DPT=9105 SEQ=2218267951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989D6E90000000001030307) Feb 20 04:09:57 localhost python3.9[131907]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None 
disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:09:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23475 DF PROTO=TCP SPT=33482 DPT=9105 SEQ=2218267951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989DEE80000000001030307) Feb 20 04:10:01 localhost python3.9[132001]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:10:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13421 DF PROTO=TCP SPT=33660 DPT=9882 SEQ=2916413918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989EB680000000001030307) Feb 20 04:10:03 localhost python3.9[132246]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None Feb 20 04:10:04 localhost python3.9[132336]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:10:05 localhost python3.9[132430]: ansible-ansible.legacy.dnf Invoked with 
name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:10:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20041 DF PROTO=TCP SPT=48136 DPT=9101 SEQ=2280697435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989F8E50000000001030307) Feb 20 04:10:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20043 DF PROTO=TCP SPT=48136 DPT=9101 SEQ=2280697435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A04E90000000001030307) Feb 20 04:10:10 localhost python3.9[132524]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:10:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=23477 DF PROTO=TCP SPT=33482 DPT=9105 SEQ=2218267951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A0F680000000001030307) Feb 20 04:10:14 localhost python3.9[132618]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 20 04:10:14 localhost systemd[1]: Reloading. Feb 20 04:10:15 localhost systemd-rc-local-generator[132650]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:10:15 localhost systemd-sysv-generator[132653]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:10:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:10:16 localhost python3.9[132750]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:10:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44494 DF PROTO=TCP SPT=48560 DPT=9102 SEQ=3346129564 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A1F680000000001030307) Feb 20 04:10:16 localhost python3.9[132842]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:17 localhost python3.9[132936]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3305 DF PROTO=TCP SPT=40502 DPT=9882 SEQ=3880154289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A256A0000000001030307) Feb 20 04:10:18 localhost python3.9[133059]: ansible-community.general.ini_file Invoked with 
backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:18 localhost python3.9[133211]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:10:19 localhost podman[133293]: Feb 20 04:10:19 localhost podman[133293]: 2026-02-20 09:10:19.384350869 +0000 UTC m=+0.081585193 container create 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1770267347, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, 
io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:10:19 localhost systemd[1]: Started libpod-conmon-31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e.scope. Feb 20 04:10:19 localhost podman[133293]: 2026-02-20 09:10:19.35431234 +0000 UTC m=+0.051546694 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:10:19 localhost systemd[1]: Started libcrun container. Feb 20 04:10:19 localhost podman[133293]: 2026-02-20 09:10:19.478843421 +0000 UTC m=+0.176077725 container init 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:10:19 localhost systemd[1]: tmp-crun.FK723U.mount: Deactivated successfully. 
Feb 20 04:10:19 localhost podman[133293]: 2026-02-20 09:10:19.492956977 +0000 UTC m=+0.190191291 container start 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, release=1770267347, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.42.2, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main, version=7, build-date=2026-02-09T10:25:24Z, ceph=True) Feb 20 04:10:19 localhost podman[133293]: 2026-02-20 09:10:19.493325968 +0000 UTC m=+0.190560322 container attach 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , ceph=True, vcs-type=git, release=1770267347, RELEASE=main, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:10:19 localhost upbeat_bouman[133341]: 167 167 Feb 20 04:10:19 localhost systemd[1]: libpod-31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e.scope: Deactivated successfully. Feb 20 04:10:19 localhost podman[133293]: 2026-02-20 09:10:19.49856067 +0000 UTC m=+0.195795004 container died 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, io.openshift.tags=rhceph ceph, release=1770267347, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, version=7, name=rhceph, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:10:19 localhost podman[133346]: 2026-02-20 09:10:19.596746266 +0000 UTC m=+0.085977940 container remove 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, version=7, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:10:19 localhost systemd[1]: libpod-conmon-31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e.scope: Deactivated successfully. 
Feb 20 04:10:19 localhost python3.9[133340]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578618.5026045-566-147557596304841/.source _original_basename=.o399jm0v follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:19 localhost podman[133381]: Feb 20 04:10:19 localhost podman[133381]: 2026-02-20 09:10:19.830579325 +0000 UTC m=+0.077344022 container create 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.buildah.version=1.42.2, GIT_CLEAN=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc.) 
Feb 20 04:10:19 localhost systemd[1]: Started libpod-conmon-99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941.scope. Feb 20 04:10:19 localhost systemd[1]: Started libcrun container. Feb 20 04:10:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d57bd2fe32e222dbc5c308ee093f37e64dcec9203f58e66675db0d4a2bc1dfe1/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 20 04:10:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d57bd2fe32e222dbc5c308ee093f37e64dcec9203f58e66675db0d4a2bc1dfe1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 04:10:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d57bd2fe32e222dbc5c308ee093f37e64dcec9203f58e66675db0d4a2bc1dfe1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 04:10:19 localhost podman[133381]: 2026-02-20 09:10:19.799514825 +0000 UTC m=+0.046279532 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:10:19 localhost podman[133381]: 2026-02-20 09:10:19.90186471 +0000 UTC m=+0.148629417 container init 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, GIT_CLEAN=True, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, ceph=True, 
distribution-scope=public, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64) Feb 20 04:10:19 localhost podman[133381]: 2026-02-20 09:10:19.911461406 +0000 UTC m=+0.158226113 container start 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, maintainer=Guillaume Abrioux , ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.42.2) Feb 20 04:10:19 localhost podman[133381]: 2026-02-20 09:10:19.911733814 +0000 UTC m=+0.158498561 container attach 
99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 04:10:20 localhost python3.9[133480]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:20 localhost systemd[1]: var-lib-containers-storage-overlay-62dd6aaec95394fbd7fb576113f27cc341e7316b5770e2202ab8860a7447ff48-merged.mount: Deactivated successfully. 
Feb 20 04:10:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3307 DF PROTO=TCP SPT=40502 DPT=9882 SEQ=3880154289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A31680000000001030307) Feb 20 04:10:20 localhost festive_jang[133423]: [ Feb 20 04:10:20 localhost festive_jang[133423]: { Feb 20 04:10:20 localhost festive_jang[133423]: "available": false, Feb 20 04:10:20 localhost festive_jang[133423]: "ceph_device": false, Feb 20 04:10:20 localhost festive_jang[133423]: "device_id": "QEMU_DVD-ROM_QM00001", Feb 20 04:10:20 localhost festive_jang[133423]: "lsm_data": {}, Feb 20 04:10:20 localhost festive_jang[133423]: "lvs": [], Feb 20 04:10:20 localhost festive_jang[133423]: "path": "/dev/sr0", Feb 20 04:10:20 localhost festive_jang[133423]: "rejected_reasons": [ Feb 20 04:10:20 localhost festive_jang[133423]: "Has a FileSystem", Feb 20 04:10:20 localhost festive_jang[133423]: "Insufficient space (<5GB)" Feb 20 04:10:20 localhost festive_jang[133423]: ], Feb 20 04:10:20 localhost festive_jang[133423]: "sys_api": { Feb 20 04:10:20 localhost festive_jang[133423]: "actuators": null, Feb 20 04:10:20 localhost festive_jang[133423]: "device_nodes": "sr0", Feb 20 04:10:20 localhost festive_jang[133423]: "human_readable_size": "482.00 KB", Feb 20 04:10:20 localhost festive_jang[133423]: "id_bus": "ata", Feb 20 04:10:20 localhost festive_jang[133423]: "model": "QEMU DVD-ROM", Feb 20 04:10:20 localhost festive_jang[133423]: "nr_requests": "2", Feb 20 04:10:20 localhost festive_jang[133423]: "partitions": {}, Feb 20 04:10:20 localhost festive_jang[133423]: "path": "/dev/sr0", Feb 20 04:10:20 localhost festive_jang[133423]: "removable": "1", Feb 20 04:10:20 localhost festive_jang[133423]: "rev": "2.5+", Feb 20 04:10:20 localhost festive_jang[133423]: "ro": "0", Feb 20 04:10:20 localhost festive_jang[133423]: "rotational": "1", Feb 20 
04:10:20 localhost festive_jang[133423]: "sas_address": "", Feb 20 04:10:20 localhost festive_jang[133423]: "sas_device_handle": "", Feb 20 04:10:20 localhost festive_jang[133423]: "scheduler_mode": "mq-deadline", Feb 20 04:10:20 localhost festive_jang[133423]: "sectors": 0, Feb 20 04:10:20 localhost festive_jang[133423]: "sectorsize": "2048", Feb 20 04:10:20 localhost festive_jang[133423]: "size": 493568.0, Feb 20 04:10:20 localhost festive_jang[133423]: "support_discard": "0", Feb 20 04:10:20 localhost festive_jang[133423]: "type": "disk", Feb 20 04:10:20 localhost festive_jang[133423]: "vendor": "QEMU" Feb 20 04:10:20 localhost festive_jang[133423]: } Feb 20 04:10:20 localhost festive_jang[133423]: } Feb 20 04:10:20 localhost festive_jang[133423]: ] Feb 20 04:10:20 localhost systemd[1]: libpod-99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941.scope: Deactivated successfully. Feb 20 04:10:20 localhost podman[133381]: 2026-02-20 09:10:20.81658993 +0000 UTC m=+1.063354657 container died 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7) Feb 20 04:10:20 localhost systemd[1]: var-lib-containers-storage-overlay-d57bd2fe32e222dbc5c308ee093f37e64dcec9203f58e66675db0d4a2bc1dfe1-merged.mount: Deactivated successfully. Feb 20 04:10:20 localhost podman[135103]: 2026-02-20 09:10:20.934968441 +0000 UTC m=+0.102037237 container remove 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=1770267347, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 20 04:10:20 localhost systemd[1]: libpod-conmon-99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941.scope: Deactivated successfully. Feb 20 04:10:21 localhost python3.9[135149]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={} Feb 20 04:10:21 localhost python3.9[135256]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:22 localhost python3.9[135348]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:10:22 localhost sshd[135349]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:10:23 localhost python3.9[135423]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578622.3412344-692-239749894350298/.source.yaml _original_basename=.wwkg5hhx follow=False checksum=4c28d1662755c608a6ffaa942e27a2488c0a78a3 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:24 localhost python3.9[135515]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml Feb 20 04:10:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3308 DF PROTO=TCP SPT=40502 DPT=9882 
SEQ=3880154289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A41280000000001030307) Feb 20 04:10:25 localhost ansible-async_wrapper.py[135620]: Invoked with j558637067917 300 /home/zuul/.ansible/tmp/ansible-tmp-1771578624.5583212-764-118499776117482/AnsiballZ_edpm_os_net_config.py _ Feb 20 04:10:25 localhost ansible-async_wrapper.py[135623]: Starting module and watcher Feb 20 04:10:25 localhost ansible-async_wrapper.py[135623]: Start watching 135624 (300) Feb 20 04:10:25 localhost ansible-async_wrapper.py[135624]: Start module (135624) Feb 20 04:10:25 localhost ansible-async_wrapper.py[135620]: Return async_wrapper task started. Feb 20 04:10:25 localhost python3.9[135625]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=False purge_provider= Feb 20 04:10:26 localhost ansible-async_wrapper.py[135624]: Module complete (135624) Feb 20 04:10:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45410 DF PROTO=TCP SPT=47560 DPT=9105 SEQ=2513380928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A4C290000000001030307) Feb 20 04:10:29 localhost python3.9[135729]: ansible-ansible.legacy.async_status Invoked with jid=j558637067917.135620 mode=status _async_dir=/root/.ansible_async Feb 20 04:10:29 localhost python3.9[135788]: ansible-ansible.legacy.async_status Invoked with jid=j558637067917.135620 mode=cleanup _async_dir=/root/.ansible_async Feb 20 04:10:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45411 DF PROTO=TCP SPT=47560 DPT=9105 SEQ=2513380928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A54280000000001030307) Feb 20 
04:10:30 localhost ansible-async_wrapper.py[135623]: Done in kid B. Feb 20 04:10:30 localhost python3.9[135880]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:10:31 localhost python3.9[135953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578630.4778092-830-267898728335266/.source.returncode _original_basename=.05itn0s6 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:32 localhost python3.9[136045]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:10:32 localhost python3.9[136118]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578631.7530682-878-121067531517744/.source.cfg _original_basename=.k9psz1h3 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:10:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3309 DF PROTO=TCP SPT=40502 DPT=9882 
SEQ=3880154289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A61680000000001030307) Feb 20 04:10:33 localhost python3.9[136210]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:10:33 localhost systemd[1]: Reloading Network Manager... Feb 20 04:10:33 localhost NetworkManager[5988]: [1771578633.6925] audit: op="reload" arg="0" pid=136214 uid=0 result="success" Feb 20 04:10:33 localhost NetworkManager[5988]: [1771578633.6934] config: signal: SIGHUP (no changes from disk) Feb 20 04:10:33 localhost systemd[1]: Reloaded Network Manager. Feb 20 04:10:34 localhost systemd-logind[759]: Session 41 logged out. Waiting for processes to exit. Feb 20 04:10:34 localhost systemd[1]: session-41.scope: Deactivated successfully. Feb 20 04:10:34 localhost systemd[1]: session-41.scope: Consumed 36.534s CPU time. Feb 20 04:10:34 localhost systemd-logind[759]: Removed session 41. Feb 20 04:10:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8425 DF PROTO=TCP SPT=44590 DPT=9101 SEQ=3992755531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A6E150000000001030307) Feb 20 04:10:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8427 DF PROTO=TCP SPT=44590 DPT=9101 SEQ=3992755531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A7A280000000001030307) Feb 20 04:10:39 localhost sshd[136229]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:10:39 localhost systemd-logind[759]: New session 42 of user zuul. Feb 20 04:10:39 localhost systemd[1]: Started Session 42 of User zuul. 
Feb 20 04:10:40 localhost python3.9[136322]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:10:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45413 DF PROTO=TCP SPT=47560 DPT=9105 SEQ=2513380928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A83680000000001030307) Feb 20 04:10:43 localhost python3.9[136416]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 20 04:10:44 localhost python3.9[136569]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:10:45 localhost systemd[1]: session-42.scope: Deactivated successfully. Feb 20 04:10:45 localhost systemd[1]: session-42.scope: Consumed 2.069s CPU time. Feb 20 04:10:45 localhost systemd-logind[759]: Session 42 logged out. Waiting for processes to exit. Feb 20 04:10:45 localhost systemd-logind[759]: Removed session 42. 
Feb 20 04:10:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33727 DF PROTO=TCP SPT=52134 DPT=9102 SEQ=3206208422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A93690000000001030307) Feb 20 04:10:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44480 DF PROTO=TCP SPT=37062 DPT=9882 SEQ=1333836637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A9A9A0000000001030307) Feb 20 04:10:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44482 DF PROTO=TCP SPT=37062 DPT=9882 SEQ=1333836637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AA6A80000000001030307) Feb 20 04:10:50 localhost sshd[136585]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:10:51 localhost sshd[136587]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:10:51 localhost systemd-logind[759]: New session 43 of user zuul. Feb 20 04:10:51 localhost systemd[1]: Started Session 43 of User zuul. 
Feb 20 04:10:52 localhost python3.9[136680]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:10:53 localhost python3.9[136774]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:10:54 localhost python3.9[136870]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:10:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44483 DF PROTO=TCP SPT=37062 DPT=9882 SEQ=1333836637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AB6680000000001030307)
Feb 20 04:10:55 localhost python3.9[136924]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:10:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59712 DF PROTO=TCP SPT=49296 DPT=9105 SEQ=2544659311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AC1680000000001030307)
Feb 20 04:10:59 localhost python3.9[137018]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:10:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59713 DF PROTO=TCP SPT=49296 DPT=9105 SEQ=2544659311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AC9690000000001030307)
Feb 20 04:11:00 localhost python3.9[137173]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:01 localhost python3.9[137265]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:11:02 localhost python3.9[137369]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51232 DF PROTO=TCP SPT=41524 DPT=9100 SEQ=138677609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AD5680000000001030307)
Feb 20 04:11:03 localhost python3.9[137417]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:03 localhost python3.9[137509]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:04 localhost python3.9[137557]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:11:05 localhost python3.9[137649]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:11:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43856 DF PROTO=TCP SPT=40846 DPT=9101 SEQ=1695473391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AE3450000000001030307)
Feb 20 04:11:06 localhost python3.9[137741]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:11:06 localhost python3.9[137833]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:11:07 localhost python3.9[137925]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:11:08 localhost python3.9[138017]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:11:08 localhost sshd[138020]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:11:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43858 DF PROTO=TCP SPT=40846 DPT=9101 SEQ=1695473391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AEF680000000001030307)
Feb 20 04:11:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59715 DF PROTO=TCP SPT=49296 DPT=9105 SEQ=2544659311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AF9680000000001030307)
Feb 20 04:11:12 localhost python3.9[138113]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:11:13 localhost python3.9[138207]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:11:14 localhost python3.9[138299]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:11:15 localhost python3.9[138391]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:11:16 localhost python3.9[138484]: ansible-service_facts Invoked
Feb 20 04:11:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46028 DF PROTO=TCP SPT=47372 DPT=9100 SEQ=2546420612 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B09680000000001030307)
Feb 20 04:11:17 localhost network[138501]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 04:11:17 localhost network[138502]: 'network-scripts' will be removed from distribution in near future.
Feb 20 04:11:17 localhost network[138503]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 04:11:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45950 DF PROTO=TCP SPT=56276 DPT=9882 SEQ=2520786034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B0FCA0000000001030307)
Feb 20 04:11:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:11:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45952 DF PROTO=TCP SPT=56276 DPT=9882 SEQ=2520786034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B1BE90000000001030307)
Feb 20 04:11:22 localhost python3.9[138887]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:11:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45953 DF PROTO=TCP SPT=56276 DPT=9882 SEQ=2520786034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B2BA90000000001030307)
Feb 20 04:11:27 localhost python3.9[138996]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 20 04:11:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34525 DF PROTO=TCP SPT=51776 DPT=9105 SEQ=3904820507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B36680000000001030307)
Feb 20 04:11:28 localhost python3.9[139088]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:29 localhost python3.9[139163]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578688.2537355-654-205181155832597/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34526 DF PROTO=TCP SPT=51776 DPT=9105 SEQ=3904820507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B3E680000000001030307)
Feb 20 04:11:30 localhost python3.9[139257]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:30 localhost sshd[139333]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:11:30 localhost python3.9[139332]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578689.7892437-701-261540160807018/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:32 localhost python3.9[139428]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45954 DF PROTO=TCP SPT=56276 DPT=9882 SEQ=2520786034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B4B680000000001030307)
Feb 20 04:11:34 localhost python3.9[139522]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:11:35 localhost python3.9[139576]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:11:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3021 DF PROTO=TCP SPT=51478 DPT=9101 SEQ=329870299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B58740000000001030307)
Feb 20 04:11:37 localhost python3.9[139670]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:11:38 localhost python3.9[139724]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 04:11:38 localhost chronyd[26351]: chronyd exiting
Feb 20 04:11:38 localhost systemd[1]: Stopping NTP client/server...
Feb 20 04:11:38 localhost systemd[1]: chronyd.service: Deactivated successfully.
Feb 20 04:11:38 localhost systemd[1]: Stopped NTP client/server.
Feb 20 04:11:38 localhost systemd[1]: Starting NTP client/server...
Feb 20 04:11:38 localhost chronyd[139732]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 20 04:11:38 localhost chronyd[139732]: Frequency -30.463 +/- 0.395 ppm read from /var/lib/chrony/drift
Feb 20 04:11:38 localhost chronyd[139732]: Loaded seccomp filter (level 2)
Feb 20 04:11:38 localhost systemd[1]: Started NTP client/server.
Feb 20 04:11:38 localhost systemd[1]: session-43.scope: Deactivated successfully.
Feb 20 04:11:38 localhost systemd[1]: session-43.scope: Consumed 28.846s CPU time.
Feb 20 04:11:38 localhost systemd-logind[759]: Session 43 logged out. Waiting for processes to exit.
Feb 20 04:11:38 localhost systemd-logind[759]: Removed session 43.
Feb 20 04:11:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3023 DF PROTO=TCP SPT=51478 DPT=9101 SEQ=329870299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B64680000000001030307)
Feb 20 04:11:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34528 DF PROTO=TCP SPT=51776 DPT=9105 SEQ=3904820507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B6F680000000001030307)
Feb 20 04:11:44 localhost sshd[139748]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:11:44 localhost systemd-logind[759]: New session 44 of user zuul.
Feb 20 04:11:44 localhost systemd[1]: Started Session 44 of User zuul.
Feb 20 04:11:45 localhost python3.9[139841]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:11:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63451 DF PROTO=TCP SPT=35400 DPT=9100 SEQ=1136106703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B7F680000000001030307)
Feb 20 04:11:46 localhost python3.9[139937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:47 localhost python3.9[140042]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22224 DF PROTO=TCP SPT=46684 DPT=9882 SEQ=2265295752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B84FA0000000001030307)
Feb 20 04:11:48 localhost python3.9[140090]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.fum15ngs recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:48 localhost python3.9[140182]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:49 localhost python3.9[140257]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578708.4834752-141-148363443541352/.source _original_basename=.7mi9d_a3 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:50 localhost python3.9[140349]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:11:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22226 DF PROTO=TCP SPT=46684 DPT=9882 SEQ=2265295752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B90E80000000001030307)
Feb 20 04:11:51 localhost python3.9[140441]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:51 localhost python3.9[140514]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578710.5190637-212-13051073380919/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:11:52 localhost python3.9[140606]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:52 localhost python3.9[140679]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578711.7641823-212-183008077849754/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:11:53 localhost python3.9[140771]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:54 localhost python3.9[140863]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:54 localhost sshd[140885]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:11:54 localhost python3.9[140938]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578713.5401254-323-55671354908258/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22227 DF PROTO=TCP SPT=46684 DPT=9882 SEQ=2265295752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BA0A80000000001030307)
Feb 20 04:11:55 localhost python3.9[141030]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:55 localhost python3.9[141103]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578714.7940207-369-253669147733664/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:56 localhost python3.9[141195]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:11:57 localhost systemd[1]: Reloading.
Feb 20 04:11:57 localhost systemd-rc-local-generator[141223]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:11:57 localhost systemd-sysv-generator[141227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:11:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:11:57 localhost systemd[1]: Reloading.
Feb 20 04:11:57 localhost systemd-rc-local-generator[141256]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:11:57 localhost systemd-sysv-generator[141259]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:11:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:11:57 localhost systemd[1]: Starting EDPM Container Shutdown...
Feb 20 04:11:57 localhost systemd[1]: Finished EDPM Container Shutdown.
Feb 20 04:11:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40835 DF PROTO=TCP SPT=39812 DPT=9105 SEQ=4143178110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BABA80000000001030307)
Feb 20 04:11:58 localhost python3.9[141364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:58 localhost python3.9[141437]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578717.8041644-437-212172924880594/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:11:59 localhost python3.9[141529]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:11:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40836 DF PROTO=TCP SPT=39812 DPT=9105 SEQ=4143178110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BB3A90000000001030307)
Feb 20 04:12:00 localhost python3.9[141602]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578719.0774527-483-209021327415134/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:00 localhost python3.9[141694]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:12:00 localhost systemd[1]: Reloading.
Feb 20 04:12:00 localhost systemd-rc-local-generator[141717]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:12:00 localhost systemd-sysv-generator[141721]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:12:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:12:02 localhost systemd[1]: Starting Create netns directory...
Feb 20 04:12:02 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 04:12:02 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 04:12:02 localhost systemd[1]: Finished Create netns directory.
Feb 20 04:12:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63452 DF PROTO=TCP SPT=35400 DPT=9100 SEQ=1136106703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BBF680000000001030307)
Feb 20 04:12:02 localhost python3.9[141826]: ansible-ansible.builtin.service_facts Invoked
Feb 20 04:12:02 localhost network[141843]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 04:12:02 localhost network[141844]: 'network-scripts' will be removed from distribution in near future.
Feb 20 04:12:03 localhost network[141845]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 04:12:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:12:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24885 DF PROTO=TCP SPT=39744 DPT=9101 SEQ=1925110667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BCDA60000000001030307)
Feb 20 04:12:08 localhost python3.9[142046]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:12:08 localhost python3.9[142121]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578727.584969-606-127817048547378/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24887 DF PROTO=TCP SPT=39744 DPT=9101 SEQ=1925110667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BD9A80000000001030307)
Feb 20 04:12:09 localhost python3.9[142214]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 04:12:09 localhost systemd[1]: Reloading OpenSSH server daemon...
Feb 20 04:12:09 localhost systemd[1]: Reloaded OpenSSH server daemon.
Feb 20 04:12:09 localhost sshd[121278]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:12:10 localhost python3.9[142310]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:10 localhost python3.9[142402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:12:11 localhost python3.9[142475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578730.4907484-700-142983149238182/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40838 DF PROTO=TCP SPT=39812 DPT=9105 SEQ=4143178110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BE3680000000001030307)
Feb 20 04:12:12 localhost python3.9[142567]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 20 04:12:12 localhost systemd[1]: Starting Time & Date Service...
Feb 20 04:12:12 localhost systemd[1]: Started Time & Date Service.
Feb 20 04:12:13 localhost python3.9[142663]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:13 localhost python3.9[142755]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:12:14 localhost python3.9[142828]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578733.5220916-804-205492178996494/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:15 localhost python3.9[142920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:12:15 localhost python3.9[142993]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578734.6781616-849-28773571617355/.source.yaml _original_basename=.gvis7vl3 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8635 DF PROTO=TCP SPT=48194 DPT=9100 SEQ=3024423624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BF3680000000001030307)
Feb 20 04:12:16 localhost python3.9[143085]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:12:16 localhost python3.9[143160]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578735.9233341-894-237230867035291/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10460 DF PROTO=TCP SPT=42346 DPT=9882 SEQ=3856575038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BFA2A0000000001030307)
Feb 20 04:12:17 localhost python3.9[143252]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:12:18 localhost python3.9[143345]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:12:19 localhost python3[143438]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 20 04:12:20 localhost python3.9[143530]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:12:21 localhost python3.9[143603]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578739.996921-1011-264038146915696/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:12:21 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24889 DF PROTO=TCP SPT=39744 DPT=9101 SEQ=1925110667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C09690000000001030307) Feb 20 04:12:22 localhost python3.9[143695]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:12:22 localhost python3.9[143768]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578741.5345569-1056-280498703820595/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:23 localhost python3.9[143860]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:12:23 localhost python3.9[143933]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578742.7351081-1101-165052859746299/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:24 localhost python3.9[144025]: ansible-ansible.legacy.stat Invoked 
with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:12:24 localhost python3.9[144098]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578743.8979042-1145-60352927874603/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:25 localhost python3.9[144190]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:12:26 localhost python3.9[144264]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578745.0928807-1191-2234488706410/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8736 DF PROTO=TCP SPT=44256 DPT=9105 SEQ=1291847517 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C1CD00000000001030307) Feb 20 04:12:26 localhost python3.9[144435]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root 
path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:27 localhost python3.9[144559]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:12:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40839 DF PROTO=TCP SPT=39812 DPT=9105 SEQ=4143178110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C23680000000001030307) Feb 20 04:12:28 localhost python3.9[144654]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:29 localhost python3.9[144762]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:29 localhost python3.9[144854]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:30 localhost python3.9[144946]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Feb 20 04:12:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62174 DF PROTO=TCP SPT=37522 DPT=9102 SEQ=3832944715 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C2CB50000000001030307) Feb 20 04:12:31 localhost python3.9[145039]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Feb 20 04:12:31 localhost systemd[1]: session-44.scope: Deactivated successfully. Feb 20 04:12:31 localhost systemd[1]: session-44.scope: Consumed 28.602s CPU time. Feb 20 04:12:31 localhost systemd-logind[759]: Session 44 logged out. Waiting for processes to exit. Feb 20 04:12:31 localhost systemd-logind[759]: Removed session 44. 
Feb 20 04:12:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63453 DF PROTO=TCP SPT=35400 DPT=9100 SEQ=1136106703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C3D680000000001030307)
Feb 20 04:12:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14532 DF PROTO=TCP SPT=48870 DPT=9101 SEQ=3105837729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C42D50000000001030307)
Feb 20 04:12:36 localhost sshd[145055]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:12:37 localhost systemd-logind[759]: New session 45 of user zuul.
Feb 20 04:12:37 localhost systemd[1]: Started Session 45 of User zuul.
Feb 20 04:12:37 localhost python3.9[145150]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible.
suffix= path=None Feb 20 04:12:38 localhost sshd[145165]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:12:39 localhost python3.9[145244]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:12:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3027 DF PROTO=TCP SPT=51478 DPT=9101 SEQ=329870299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C53680000000001030307) Feb 20 04:12:40 localhost python3.9[145338]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts Feb 20 04:12:41 localhost python3.9[145430]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.zwjwm229 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:12:42 localhost python3.9[145505]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.zwjwm229 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578761.4350631-192-223767579384344/.source.zwjwm229 _original_basename=.waklz5h8 follow=False checksum=831757da1f03f9732785943fa2a05c0d9424aa2f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:42 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
Feb 20 04:12:45 localhost python3.9[145599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:12:46 localhost python3.9[145691]: ansible-ansible.builtin.blockinfile Invoked with block=np0005625201.localdomain,192.168.122.105,np0005625201* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyGkX26ECIsvqnvJegedSF6KicDAAqjaifawEd//OuK9zdHIWqO3XmlEszZqWPsdQhPFkelfzXR+sy3gbPNv+yjT7phsw1sq7zHXeogQFlP5iOQZrf6hCnfXxVk2ckIXMT0UJVZ8FCTwsQi+HKkR/IEj08pR7EjrXGWxHkjv5wNj76spF3FJxtwycS4+KzY3UFy7gYWVn2jB0ha966YgjHMPhzQnT33W9myxGH33M1L5ZCGlfH19hLnqTUNMfzIfw3afxHkL5BFZbhthUPmIfLdLtKmZEkpSTBO/CrNA6CmMfY6xnT78hmwXytEQ+jeiRdKXdr9xQ2j6wVmPzckFKBsBYRe4DprKGt93fnKS9Z6A3Sv626DyZgDa8/NXbtAaBxtyix5Vdt872hYvCzYyB/OuSV6PR5DOq8z3fquOwgtka3rA6qL5gxhFJcO5TqtBM76DzOLd9OLM9bIO1yK9sCmbYynMojkXylzhDfcI8kytS5xs9FJEfwTElZRHkEIQE=#012np0005625201.localdomain,192.168.122.105,np0005625201* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINiFV2XLGVf9PGXF0NE4rbupw+vH23sDv10vB3wGrrmN#012np0005625201.localdomain,192.168.122.105,np0005625201* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM/mxytSzwSYcezRRSD4AjPi1j6Bxso/MLXC/NAewzvKThRznoUobc02vzGaO4FrwuZIZ/YHJyAHrQRbtdSPUTU=#012np0005625202.localdomain,192.168.122.106,np0005625202* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDr8sejencX7nSCX6AegGtTuiZL3yclu/L7ZVN4B6dKPdmHqVr33QJD40sEk28GHpx8BrkPU2Qj1de9H6mGtrlwhmJr7Pccg/YqzKoTCQD5rZQ4youU8H70As6YX5ZlXyulwI1SH70XjMm37x4ptKALFOjRnHg0WIXah/tAmzrY/orh+/eCcns7APVjN9B1o+MqP4r47WrWrGU/KxtsHc6dflWxZW7BWUCCNS0e3C4yWLRjy8Hhj7Qkpssv/UBcj+olVHadUUOYiaQZ5Y33MjxwIg8o1MuC7C1dNIn8eXOXXiA8jd/lJd9kImrCGUtkVqj8VQgsMh4vRYMD+0SNLYRDVwxdemOzJYgwQhgiWZ0G+cVhnTBpMmXyIws2OpOKU8R3HjTC3jz+BxvjwEvMDoQfpGgsHB9NCXnkQzs2F8EA8LpA823Ef1SMgPdDCaQzvN5oQPZkWAPMVHvq31xpN9q+KXg/bg0uDaIZXUxW2rGnem7pFS78rRUGL6MfSMn1zs=#012np0005625202.localdomain,192.168.122.106,np0005625202* 
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHIvGY3AHSeC6TXoQUOT+qZPpfcpbcCaqWpewY2PaUdr#012np0005625202.localdomain,192.168.122.106,np0005625202* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNhJMOoHTPuI+cufoglj5k5xopCSTjiletXnoJ15KnCBclkNCXy9DqMn/ZeknN3AqFVQZhJfknnRkCXvgtRg7lc=#012np0005625200.localdomain,192.168.122.104,np0005625200* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDW88346W6zU6nxCpqapHtIr5nRG8Jn9LFit3r5klBfauCkmAGONb4X8IwKjo8MD9etebUVbo6aX9gBMBMSs7bSoHzsEQuMLpBDrweSbahQj+gqZ5TmQ/xvwbhws04z3/IJxapAk2xWu7khVGjvOPUE1CROkP+1LiGktQ6Xj1ar1TbLNud2Dq/R5ZalbpK0OT3+no3x0oAJT3W649tW4nmCWcNaxykPsLREsUlH2qVoceAzLEDCSde9/1TONc/URyB4acVqmEwJDHeX51bh31tpQwp/WSe0vKQ6eUw63Tmpn+dRI9xbnFhc6mgGAPcEw7cAUkM7oM6bYMSvVxYDmzMhuXUU/9i3mdMnDBkMyZ5Oed6ZSmFQIJe5k7cz3783d35ZXfl/HsYMqoZ3lmDgbeS59pQrI+BldKyv3sTnoCDahfcmzmiHssxqa7tT5KOuR444q7Nj6wJEIZMEEJEHtMlh1iSBRJZOEOaKjo7h+jV7KMe75aPRasvu9K1v0dqyG6U=#012np0005625200.localdomain,192.168.122.104,np0005625200* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKZd4BJQ7FPHukFUlQ3fRSVsRqMpZA9FFzC98e6Nz+hC#012np0005625200.localdomain,192.168.122.104,np0005625200* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJgelHBDBResuC/7QDQA12qTpLPW1xHX6eUvY/QfQ0s1DYziYEKuSHQhUQMzxPcUq9IVVPnxkoRvZdWPxsh2Cmk=#012np0005625199.localdomain,192.168.122.103,np0005625199* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrnsozeOPJKYg9sx2Tj6QOLRhujK5RVh5RZQ3sb0pk+DbWHQKqS1YvJUg2hV4WxbxPnNUCBtJ+RZ8lVm6RLM+hc3ffe2sOMOz5upO/hTlIpBSfJpQORkiNW+XIXdDVxgE418veFd2hASFmiCmKoFSKXsvnmFU9oTEpja1plcXSqCobFMVYKlhcRo66O0ySlGOR+o3Ar2yNJQjFErEGvZLoDEa/VlA6zreYmTaIsnlUDie0gbm5teTlsCcEYkvWcTzcfOEX2kXQRQbS5qlPtGg7c+KMv5e40rE+2QOigLmOOPVGwNYuLuhb/EHT0C8hK8otW4tiXxBlSZ5ONKY6YYQOpy7krNkWRxNXzK0LfXo2bt2apDaMzebPOvuBj1YyBiLpa6/aLvS/dtGolQNPDpFivPbP/mSpat1qTs0W3/2HyBovwWSGJDW8MMYxbZJ0Z6tnuOwdrPTdkhIibfW9wxgL7EHrDYrGx5CvA2vUM4KDKRntz/cCMGE/zKacSJ48nNk=#012np0005625199.localdomain,192.168.122.103,np0005625199* ssh-ed25519 
AAAAC3NzaC1lZDI1NTE5AAAAIENpQQgr9IVl8UWbQ9CANzH6ET+G2aHJkzVgu9ObE0o0#012np0005625199.localdomain,192.168.122.103,np0005625199* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJUcn4Y73wlRXKxRegM8lRt5GQ//hAORn8IqrcrC5ZJyjHCZmp+wutQeuPqPsTK4OVK+uH/93l/3Av8AKvpXG3A=#012np0005625203.localdomain,192.168.122.107,np0005625203* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtf1NXQ3EGQGdpLLLxuODKBdTGwqsiHL2QZ6zcfpGAa7EhDIxuEcLboqOGjQO0FM3u+kl2gIgKF0UsY5Vjcv4mDCMp7A7srq7TVo5lE5cCppbbXr0/PH2L/naHU3W+W83aT5RE17XPJ0Acn3W51WFBoICCCc4jjWTGmkNEgurKBJmdr0n8NeIcUWZ7Abrs/N2xzNftEFIjAPwebxgEwgCx0hMbdjTFhKbB/V7CjKaCU/UjirWMW5aDQJQEfrCM9u4NHuGaWKzJgar4/shNHaRvkCDbVrRPTCyfNebE04J/R42X3yWmvww4TMZVpRROd/u6Pgg1P2tbPGfQ0XvS0rfY6W4/VnHcyRDqxILH5BoeCAbTuVFmR0hbQu9fNbNxTP+o+na9mHEbNxbhcREnkal8+M0l11YftCRkr4132JITxe7y93gN/dwxE3nJLHLXRuRskWc3GTDT2MVU2Sj64yizD9KOM3oiMBXdPbNbgZywu3hqQvpO00GVg6QRjEJoiFc=#012np0005625203.localdomain,192.168.122.107,np0005625203* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPIEBJz4VBziYqCcr9UT9NnbvRxFLoAcnVJLavCpXqHm#012np0005625203.localdomain,192.168.122.107,np0005625203* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9k0T2/IFyFrBAAoi3QqwBKC9bi/bemQO6MNZhrO12MSG3WZcjS1bhOFPw5LuM+f11BFCm5wNyBNY/QmALZTgE=#012np0005625204.localdomain,192.168.122.108,np0005625204* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAo6exxFtNk/Y5qEGYenJyhnCsS7iZmCGsFaQtJElNSeTTX9a1P0P2EmjtHolRxnljCZ2X8HgWx/irhJvWLoS+dzF5l+KcyQy83+048h51mbnj7zV2uG9i8LkO0egs1uBBp5E+hauHMsuf0nIDFl45W86ZXuf+MfFEKCInhjB5gfE9tTjwmKwKhgO1DE7Vpx3OYy1FHkq0YDBCqQHuuhYPrLZPjfVv3vGOaHH/XCsxX3h8/ixsZbobD56dDBKF/8CFyC/guH8pNUhZHG0dEhz5BT8PcE2Q/M9pPttzmRQksfg9+q7lVy9eCoOVpzqfTgjE1cm5yISwuMZzaNxwjJKB54EWpfl5xxnkC14B+xdvowxpl1PcMNZ0q1fWofJF4TrJAwWCUYZf45aUV2yb5R8WavUT0pX32xmd4zFbXusoafiw2FcgnxoGz3N4ZgIxTPPmgUe13blr1SK44huXWPioaolFBo82xVVFHc+01vfLF3xvs86d6EpqpLH+yaCeUjE=#012np0005625204.localdomain,192.168.122.108,np0005625204* ssh-ed25519 
AAAAC3NzaC1lZDI1NTE5AAAAIDTY+/nqIDkr9+7jl3LUu4apuQeFzQYkXiSihEezHlEw#012np0005625204.localdomain,192.168.122.108,np0005625204* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPuq/q6JwPgXzS/TgJ6dhP0gZvq89Vk1r9Ou051lEnMdt+NHYUjJx2Tv1oS9A+wQXivor03/iqWU5nj5QHdvHx4=#012 create=True mode=0644 path=/tmp/ansible.zwjwm229 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:47 localhost python3.9[145783]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.zwjwm229' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:12:47 localhost sshd[145784]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:12:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28396 DF PROTO=TCP SPT=45726 DPT=9882 SEQ=3482117758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C6F5A0000000001030307) Feb 20 04:12:48 localhost python3.9[145879]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.zwjwm229 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:12:49 localhost systemd[1]: session-45.scope: Deactivated successfully. 
Feb 20 04:12:49 localhost systemd[1]: session-45.scope: Consumed 4.273s CPU time.
Feb 20 04:12:49 localhost systemd-logind[759]: Session 45 logged out. Waiting for processes to exit.
Feb 20 04:12:49 localhost systemd-logind[759]: Removed session 45.
Feb 20 04:12:55 localhost sshd[145895]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:12:55 localhost systemd-logind[759]: New session 46 of user zuul.
Feb 20 04:12:55 localhost systemd[1]: Started Session 46 of User zuul.
Feb 20 04:12:56 localhost python3.9[145988]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:12:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17313 DF PROTO=TCP SPT=40530 DPT=9105 SEQ=2733664460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C92000000000001030307)
Feb 20 04:12:57 localhost python3.9[146084]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 20 04:12:59 localhost python3.9[146178]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 04:13:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1414 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=3230471770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CA1EA0000000001030307)
Feb 20 04:13:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20305 DF PROTO=TCP SPT=37448
DPT=9100 SEQ=293745995 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CA2400000000001030307) Feb 20 04:13:01 localhost python3.9[146271]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:13:02 localhost python3.9[146364]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:13:02 localhost python3.9[146458]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:13:03 localhost python3.9[146553]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:13:03 localhost systemd[1]: session-46.scope: Deactivated successfully. Feb 20 04:13:03 localhost systemd[1]: session-46.scope: Consumed 4.132s CPU time. Feb 20 04:13:03 localhost systemd-logind[759]: Session 46 logged out. Waiting for processes to exit. Feb 20 04:13:03 localhost systemd-logind[759]: Removed session 46. 
Feb 20 04:13:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26591 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CB8040000000001030307)
Feb 20 04:13:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26592 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CBC280000000001030307)
Feb 20 04:13:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26593 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CC4280000000001030307)
Feb 20 04:13:09 localhost sshd[146568]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:13:09 localhost systemd-logind[759]: New session 47 of user zuul.
Feb 20 04:13:09 localhost systemd[1]: Started Session 47 of User zuul.
Feb 20 04:13:10 localhost python3.9[146661]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:13:12 localhost python3.9[146757]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 20 04:13:13 localhost python3.9[146811]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 20 04:13:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26594 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CD3E80000000001030307) Feb 20 04:13:17 localhost python3.9[146903]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:13:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9274 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CE48C0000000001030307) Feb 20 04:13:18 localhost python3.9[146996]: 
ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9275 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CE8A80000000001030307)
Feb 20 04:13:19 localhost python3.9[147088]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:20 localhost python3.9[147180]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012 * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:20 localhost python3.9[147270]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 04:13:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9276 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CF0A80000000001030307)
Feb 20 04:13:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26595 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CF3690000000001030307)
Feb 20 04:13:21 localhost python3.9[147360]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:13:22 localhost python3.9[147452]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:13:22 localhost sshd[147469]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:13:22 localhost systemd[1]: session-47.scope: Deactivated successfully.
Feb 20 04:13:22 localhost systemd[1]: session-47.scope: Consumed 9.130s CPU time.
Feb 20 04:13:22 localhost systemd-logind[759]: Session 47 logged out. Waiting for processes to exit.
Feb 20 04:13:22 localhost systemd-logind[759]: Removed session 47.
Feb 20 04:13:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9277 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D00690000000001030307)
Feb 20 04:13:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1814 DF PROTO=TCP SPT=38098 DPT=9105 SEQ=2797951751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D07300000000001030307)
Feb 20 04:13:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1815 DF PROTO=TCP SPT=38098 DPT=9105 SEQ=2797951751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D0B290000000001030307)
Feb 20 04:13:28 localhost sshd[147471]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:13:28 localhost systemd-logind[759]: New session 48 of user zuul.
Feb 20 04:13:28 localhost systemd[1]: Started Session 48 of User zuul.
Feb 20 04:13:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1816 DF PROTO=TCP SPT=38098 DPT=9105 SEQ=2797951751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D13280000000001030307)
Feb 20 04:13:29 localhost python3.9[147614]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:13:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47218 DF PROTO=TCP SPT=57806 DPT=9102 SEQ=795303546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D17140000000001030307)
Feb 20 04:13:31 localhost sshd[147692]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:13:31 localhost python3.9[147739]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:13:32 localhost python3.9[147831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:33 localhost python3.9[147904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578812.0421615-180-164270337795508/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9278 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D21690000000001030307)
Feb 20 04:13:33 localhost python3.9[147996]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:13:34 localhost python3.9[148088]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:34 localhost python3.9[148161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578813.9976404-248-84902633670387/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:35 localhost python3.9[148253]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:13:35 localhost sshd[148281]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:13:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39658 DF PROTO=TCP SPT=53832 DPT=9101 SEQ=499422902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D2D350000000001030307)
Feb 20 04:13:36 localhost python3.9[148347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:36 localhost python3.9[148420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578815.842179-320-144077208891245/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:37 localhost python3.9[148512]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:13:38 localhost python3.9[148604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:38 localhost python3.9[148677]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578817.7024274-395-253454785170199/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39660 DF PROTO=TCP SPT=53832 DPT=9101 SEQ=499422902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D39280000000001030307)
Feb 20 04:13:39 localhost python3.9[148769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:13:39 localhost python3.9[148861]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:40 localhost python3.9[148934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578819.5241244-471-227440666638848/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:41 localhost python3.9[149026]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:13:41 localhost python3.9[149118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1818 DF PROTO=TCP SPT=38098 DPT=9105 SEQ=2797951751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D43680000000001030307)
Feb 20 04:13:42 localhost python3.9[149191]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578821.411394-547-258153400884304/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:43 localhost python3.9[149283]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:13:43 localhost python3.9[149375]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:44 localhost python3.9[149448]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578823.2096717-621-224646440991240/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:44 localhost python3.9[149540]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:13:45 localhost python3.9[149632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14934 DF PROTO=TCP SPT=48324 DPT=9100 SEQ=1107177072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D53680000000001030307)
Feb 20 04:13:46 localhost python3.9[149705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578825.145644-695-133933484168486/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:46 localhost systemd[1]: session-48.scope: Deactivated successfully.
Feb 20 04:13:46 localhost systemd[1]: session-48.scope: Consumed 11.895s CPU time.
Feb 20 04:13:46 localhost systemd-logind[759]: Session 48 logged out. Waiting for processes to exit.
Feb 20 04:13:46 localhost systemd-logind[759]: Removed session 48.
Feb 20 04:13:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51396 DF PROTO=TCP SPT=55106 DPT=9882 SEQ=2494164795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D59BA0000000001030307)
Feb 20 04:13:47 localhost chronyd[139732]: Selected source 23.133.168.246 (pool.ntp.org)
Feb 20 04:13:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51398 DF PROTO=TCP SPT=55106 DPT=9882 SEQ=2494164795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D65A80000000001030307)
Feb 20 04:13:51 localhost sshd[149721]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:13:51 localhost systemd-logind[759]: New session 49 of user zuul.
Feb 20 04:13:51 localhost systemd[1]: Started Session 49 of User zuul.
Feb 20 04:13:52 localhost python3.9[149816]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:53 localhost python3.9[149908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:54 localhost python3.9[149981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578832.8325264-59-156393497156177/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=8e2004121a34320613d32710ae37702da8d027e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:54 localhost python3.9[150073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:13:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51399 DF PROTO=TCP SPT=55106 DPT=9882 SEQ=2494164795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D75690000000001030307)
Feb 20 04:13:55 localhost python3.9[150146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578834.236101-59-16596974300586/.source.conf _original_basename=ceph.conf follow=False checksum=936d449f31af670125791fe297b02d275b2ba4b7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:13:55 localhost systemd[1]: session-49.scope: Deactivated successfully.
Feb 20 04:13:55 localhost systemd[1]: session-49.scope: Consumed 2.304s CPU time.
Feb 20 04:13:55 localhost systemd-logind[759]: Session 49 logged out. Waiting for processes to exit.
Feb 20 04:13:55 localhost systemd-logind[759]: Removed session 49.
Feb 20 04:13:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42356 DF PROTO=TCP SPT=38558 DPT=9105 SEQ=1121670155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D80680000000001030307)
Feb 20 04:13:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42357 DF PROTO=TCP SPT=38558 DPT=9105 SEQ=1121670155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D88680000000001030307)
Feb 20 04:14:00 localhost sshd[150161]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:14:00 localhost systemd-logind[759]: New session 50 of user zuul.
Feb 20 04:14:00 localhost systemd[1]: Started Session 50 of User zuul.
Feb 20 04:14:01 localhost python3.9[150254]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:14:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51400 DF PROTO=TCP SPT=55106 DPT=9882 SEQ=2494164795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D95680000000001030307)
Feb 20 04:14:03 localhost python3.9[150350]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:14:03 localhost python3.9[150442]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:14:04 localhost python3.9[150532]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:14:05 localhost python3.9[150624]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 20 04:14:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44676 DF PROTO=TCP SPT=54666 DPT=9101 SEQ=1974479732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DA2650000000001030307)
Feb 20 04:14:06 localhost sshd[150717]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:14:06 localhost python3.9[150716]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:14:07 localhost python3.9[150772]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:14:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44678 DF PROTO=TCP SPT=54666 DPT=9101 SEQ=1974479732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DAE680000000001030307)
Feb 20 04:14:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42359 DF PROTO=TCP SPT=38558 DPT=9105 SEQ=1121670155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DB9680000000001030307)
Feb 20 04:14:12 localhost python3.9[150867]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 04:14:13 localhost python3[150962]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 20 04:14:14 localhost python3.9[151054]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:14 localhost python3.9[151146]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:14:15 localhost python3.9[151194]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:15 localhost python3.9[151286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:14:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47665 DF PROTO=TCP SPT=55052 DPT=9100 SEQ=3674430406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DC9680000000001030307)
Feb 20 04:14:16 localhost python3.9[151334]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.r8tqv3z1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:17 localhost python3.9[151426]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:14:17 localhost python3.9[151474]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33327 DF PROTO=TCP SPT=53850 DPT=9882 SEQ=2368391008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DCEEA0000000001030307)
Feb 20 04:14:19 localhost python3.9[151566]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:14:20 localhost python3[151659]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 20 04:14:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33329 DF PROTO=TCP SPT=53850 DPT=9882 SEQ=2368391008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DDAE80000000001030307)
Feb 20 04:14:20 localhost python3.9[151751]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:14:21 localhost python3.9[151826]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578860.3127856-428-199159928053917/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:22 localhost python3.9[151918]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:14:22 localhost python3.9[151993]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578861.6026163-474-244494251495405/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:23 localhost python3.9[152085]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:14:23 localhost python3.9[152160]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578862.8613636-519-157750907599517/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:24 localhost python3.9[152252]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:14:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33330 DF PROTO=TCP SPT=53850 DPT=9882 SEQ=2368391008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DEAA80000000001030307)
Feb 20 04:14:25 localhost python3.9[152327]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578864.0132675-564-35817176487239/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:25 localhost python3.9[152419]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:14:26 localhost python3.9[152494]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578865.2043014-608-150432252551954/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:27 localhost python3.9[152586]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55238 DF PROTO=TCP SPT=50838 DPT=9105 SEQ=3095575396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DF5A80000000001030307)
Feb 20 04:14:27 localhost python3.9[152678]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:14:28 localhost python3.9[152773]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:14:29 localhost python3.9[152865]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:14:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55239 DF PROTO=TCP SPT=50838 DPT=9105 SEQ=3095575396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DFDA80000000001030307)
Feb 20 04:14:29 localhost python3.9[152958]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:14:30 localhost python3.9[153082]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:14:31 localhost podman[153183]: 2026-02-20 09:14:31.077418463 +0000 UTC m=+0.098397303 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.,
io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Feb 20 04:14:31 localhost podman[153183]: 2026-02-20 09:14:31.212168477 +0000 UTC m=+0.233147317 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured 
and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, version=7, io.buildah.version=1.42.2, name=rhceph) Feb 20 04:14:31 localhost python3.9[153270]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:14:32 localhost python3.9[153465]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:14:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2161 DF PROTO=TCP SPT=50532 DPT=9102 SEQ=761812327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E09680000000001030307) Feb 20 04:14:33 localhost python3.9[153575]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . 
external_ids:hostname=np0005625204.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:e8:77:41:0b" external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:14:33 localhost ovs-vsctl[153576]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005625204.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:e8:77:41:0b external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch Feb 20 04:14:34 localhost python3.9[153668]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:14:35 localhost python3.9[153761]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False 
checksum_algorithm=sha1 Feb 20 04:14:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64736 DF PROTO=TCP SPT=44892 DPT=9101 SEQ=3976226729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E17950000000001030307) Feb 20 04:14:36 localhost python3.9[153855]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:14:37 localhost sshd[153948]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:14:37 localhost python3.9[153947]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:14:38 localhost python3.9[153997]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:14:38 localhost python3.9[154089]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:14:39 localhost 
python3.9[154137]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:14:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64738 DF PROTO=TCP SPT=44892 DPT=9101 SEQ=3976226729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E23A80000000001030307) Feb 20 04:14:39 localhost python3.9[154229]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:14:40 localhost python3.9[154321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:14:40 localhost python3.9[154369]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:14:41 localhost python3.9[154461]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:14:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55241 DF PROTO=TCP SPT=50838 DPT=9105 SEQ=3095575396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E2D680000000001030307) Feb 20 04:14:41 localhost python3.9[154509]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:14:42 localhost python3.9[154601]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:14:42 localhost systemd[1]: Reloading. Feb 20 04:14:42 localhost systemd-rc-local-generator[154626]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:14:42 localhost systemd-sysv-generator[154629]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:14:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:14:44 localhost python3.9[154730]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:14:44 localhost python3.9[154778]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:14:45 localhost python3.9[154870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:14:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20070 DF PROTO=TCP SPT=49028 DPT=9100 SEQ=3165136517 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E3D680000000001030307) Feb 20 04:14:46 localhost python3.9[154918]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file 
path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:14:47 localhost python3.9[155010]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:14:47 localhost systemd[1]: Reloading. Feb 20 04:14:47 localhost systemd-rc-local-generator[155032]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:14:47 localhost systemd-sysv-generator[155036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:14:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:14:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29220 DF PROTO=TCP SPT=52666 DPT=9882 SEQ=753057179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E441B0000000001030307) Feb 20 04:14:47 localhost systemd[1]: Starting Create netns directory... Feb 20 04:14:47 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 20 04:14:47 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 20 04:14:47 localhost systemd[1]: Finished Create netns directory. 
Feb 20 04:14:49 localhost python3.9[155143]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:14:49 localhost python3.9[155235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:14:50 localhost python3.9[155308]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578889.4770167-1343-209736468978954/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 20 04:14:50 localhost sshd[155309]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:14:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29222 DF PROTO=TCP SPT=52666 DPT=9882 SEQ=753057179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E50280000000001030307) Feb 20 04:14:51 localhost python3.9[155402]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:14:52 localhost python3.9[155494]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:14:52 localhost python3.9[155586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:14:53 localhost python3.9[155661]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578892.4211583-1442-247753634620878/.source.json _original_basename=.2g6eflyg follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:14:54 localhost python3.9[155751]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 
04:14:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29223 DF PROTO=TCP SPT=52666 DPT=9882 SEQ=753057179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E5FE80000000001030307) Feb 20 04:14:56 localhost python3.9[156004]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False Feb 20 04:14:57 localhost python3.9[156096]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:14:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28620 DF PROTO=TCP SPT=34900 DPT=9105 SEQ=68265424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E6AE80000000001030307) Feb 20 04:14:58 localhost python3[156188]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:14:59 localhost python3[156188]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e",#012 "Digest": "sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-01-30T06:38:56.623500445Z",#012 "Config": {#012 
"User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 346422728,#012 "VirtualSize": 346422728,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:033e0289d512b27a678c3feb7195acb9c5f2fbb27c9b2d8c8b5b5f6156f0d11f",#012 "sha256:f848a534c5dfe59c31c3da34c3d2466bdea7e8da7def4225acdd3ffef1544d2f"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": 
"OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 
"created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:55.650316471Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util- Feb 20 04:14:59 localhost podman[156239]: 2026-02-20 09:14:59.152231438 +0000 UTC m=+0.079357820 container remove 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible) Feb 20 04:14:59 localhost python3[156188]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller Feb 20 04:14:59 localhost podman[156253]: Feb 20 04:14:59 localhost podman[156253]: 2026-02-20 09:14:59.251059241 +0000 UTC m=+0.081389956 container create 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller) Feb 20 04:14:59 localhost podman[156253]: 2026-02-20 09:14:59.212906384 +0000 UTC m=+0.043237129 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Feb 20 04:14:59 localhost python3[156188]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Feb 20 04:14:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28621 DF PROTO=TCP SPT=34900 DPT=9105 SEQ=68265424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E72E80000000001030307) Feb 20 04:14:59 localhost python3.9[156384]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:15:00 localhost 
python3.9[156479]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:01 localhost python3.9[156525]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:15:01 localhost python3.9[156616]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771578901.1729815-1676-16375319791171/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:02 localhost python3.9[156662]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:15:02 localhost systemd[1]: Reloading. Feb 20 04:15:02 localhost systemd-rc-local-generator[156684]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:15:02 localhost systemd-sysv-generator[156688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:15:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Feb 20 04:15:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29224 DF PROTO=TCP SPT=52666 DPT=9882 SEQ=753057179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E7F680000000001030307) Feb 20 04:15:03 localhost python3.9[156744]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:15:03 localhost systemd[1]: Reloading. Feb 20 04:15:03 localhost systemd-rc-local-generator[156768]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:15:03 localhost systemd-sysv-generator[156772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:15:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:15:03 localhost systemd[1]: Starting ovn_controller container... Feb 20 04:15:03 localhost systemd[1]: Started libcrun container. Feb 20 04:15:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8083c7e517bedac32994ed107decec2298ae4369d578f2e10ab7db3bec0cf06e/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Feb 20 04:15:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:15:03 localhost podman[156786]: 2026-02-20 09:15:03.792025944 +0000 UTC m=+0.159277873 container init 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:15:03 localhost ovn_controller[156798]: + sudo -E kolla_set_configs Feb 20 04:15:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:15:03 localhost podman[156786]: 2026-02-20 09:15:03.837209836 +0000 UTC m=+0.204461765 container start 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:15:03 localhost edpm-start-podman-container[156786]: ovn_controller Feb 20 04:15:03 localhost systemd[1]: Created slice User Slice of UID 0. Feb 20 04:15:03 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 20 04:15:03 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 20 04:15:03 localhost systemd[1]: Starting User Manager for UID 0... 
Feb 20 04:15:03 localhost edpm-start-podman-container[156785]: Creating additional drop-in dependency for "ovn_controller" (67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f) Feb 20 04:15:03 localhost systemd[1]: Reloading. Feb 20 04:15:04 localhost podman[156806]: 2026-02-20 09:15:04.013601249 +0000 UTC m=+0.168734225 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:15:04 localhost systemd[156827]: Queued start job for default target Main User Target. 
Feb 20 04:15:04 localhost podman[156806]: 2026-02-20 09:15:04.053193256 +0000 UTC m=+0.208326252 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Feb 20 04:15:04 localhost systemd[156827]: Created slice User Application Slice. Feb 20 04:15:04 localhost systemd[156827]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). 
Feb 20 04:15:04 localhost systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Feb 20 04:15:04 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 20 04:15:04 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:15:04 localhost systemd[156827]: Started Daily Cleanup of User's Temporary Directories. Feb 20 04:15:04 localhost systemd[156827]: Reached target Paths. Feb 20 04:15:04 localhost systemd[156827]: Reached target Timers. Feb 20 04:15:04 localhost systemd[156827]: Starting D-Bus User Message Bus Socket... Feb 20 04:15:04 localhost systemd[156827]: Starting Create User's Volatile Files and Directories... Feb 20 04:15:04 localhost podman[156806]: unhealthy Feb 20 04:15:04 localhost systemd[156827]: Listening on D-Bus User Message Bus Socket. Feb 20 04:15:04 localhost systemd[156827]: Finished Create User's Volatile Files and Directories. Feb 20 04:15:04 localhost systemd[156827]: Reached target Sockets. Feb 20 04:15:04 localhost systemd[156827]: Reached target Basic System. Feb 20 04:15:04 localhost systemd[156827]: Reached target Main User Target. Feb 20 04:15:04 localhost systemd[156827]: Startup finished in 125ms. Feb 20 04:15:04 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:15:04 localhost systemd-rc-local-generator[156889]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:15:04 localhost systemd-sysv-generator[156893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:15:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:15:04 localhost systemd[1]: Started User Manager for UID 0. Feb 20 04:15:04 localhost systemd[1]: Started ovn_controller container. Feb 20 04:15:04 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:15:04 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Failed with result 'exit-code'. Feb 20 04:15:04 localhost systemd[1]: Started Session c12 of User root. Feb 20 04:15:04 localhost ovn_controller[156798]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 20 04:15:04 localhost ovn_controller[156798]: INFO:__main__:Validating config file Feb 20 04:15:04 localhost ovn_controller[156798]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 20 04:15:04 localhost ovn_controller[156798]: INFO:__main__:Writing out command to execute Feb 20 04:15:04 localhost systemd[1]: session-c12.scope: Deactivated successfully. Feb 20 04:15:04 localhost ovn_controller[156798]: ++ cat /run_command Feb 20 04:15:04 localhost ovn_controller[156798]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock ' Feb 20 04:15:04 localhost ovn_controller[156798]: + ARGS= Feb 20 04:15:04 localhost ovn_controller[156798]: + sudo kolla_copy_cacerts Feb 20 04:15:04 localhost systemd[1]: Started Session c13 of User root. Feb 20 04:15:04 localhost systemd[1]: session-c13.scope: Deactivated successfully. Feb 20 04:15:04 localhost ovn_controller[156798]: + [[ ! -n '' ]] Feb 20 04:15:04 localhost ovn_controller[156798]: + . 
kolla_extend_start Feb 20 04:15:04 localhost ovn_controller[156798]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock ' Feb 20 04:15:04 localhost ovn_controller[156798]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\''' Feb 20 04:15:04 localhost ovn_controller[156798]: + umask 0022 Feb 20 04:15:04 localhost ovn_controller[156798]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8] Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00004|main|INFO|OVS IDL reconnected, force recompute. Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting... Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00006|main|INFO|OVNSB IDL reconnected, force recompute. Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... 
Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00011|features|INFO|OVS Feature: ct_flush, state: supported Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00013|main|INFO|OVS feature set changed, force recompute. Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms) Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute. Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00021|main|INFO|OVS feature set changed, force recompute. 
Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-2df8cc-0 Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-0c414b-0 Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-2275c3-0 Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4 Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00026|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00027|binding|INFO|Claiming lport e7aa8e2a-27a6-452b-906c-21cea166b882 for this chassis. Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00028|binding|INFO|e7aa8e2a-27a6-452b-906c-21cea166b882: Claiming fa:16:3e:b0:ed:d2 192.168.0.140 Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00029|binding|INFO|Removing lport e7aa8e2a-27a6-452b-906c-21cea166b882 ovn-installed in OVS Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... 
Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00030|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-2df8cc-0 Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-0c414b-0 Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00033|ovn_bfd|INFO|Enabled BFD on interface ovn-2275c3-0 Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00034|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00035|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00036|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00037|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:04 localhost ovn_controller[156798]: 2026-02-20T09:15:04Z|00038|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:05 localhost python3.9[157000]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 20 04:15:05 localhost ovn_controller[156798]: 2026-02-20T09:15:05Z|00039|binding|INFO|Releasing lport 
3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:05 localhost ovn_controller[156798]: 2026-02-20T09:15:05Z|00040|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:06 localhost python3.9[157092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:06 localhost ovn_controller[156798]: 2026-02-20T09:15:06Z|00041|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:15:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27168 DF PROTO=TCP SPT=52160 DPT=9101 SEQ=2015171662 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E8CC50000000001030307) Feb 20 04:15:06 localhost python3.9[157166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578905.7964308-1811-10320456623650/.source.yaml _original_basename=.tyebha5s follow=False checksum=035aea7be6ab20b22f84818c544954f904d1fea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:07 localhost python3.9[157258]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . 
other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:15:07 localhost ovs-vsctl[157259]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload Feb 20 04:15:08 localhost python3.9[157351]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:15:08 localhost ovs-vsctl[157353]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids Feb 20 04:15:09 localhost python3.9[157446]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:15:09 localhost ovs-vsctl[157447]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options Feb 20 04:15:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27170 DF PROTO=TCP SPT=52160 DPT=9101 SEQ=2015171662 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E98E80000000001030307) Feb 20 04:15:09 localhost systemd[1]: session-50.scope: Deactivated successfully. Feb 20 04:15:09 localhost systemd[1]: session-50.scope: Consumed 42.127s CPU time. Feb 20 04:15:09 localhost systemd-logind[759]: Session 50 logged out. Waiting for processes to exit. Feb 20 04:15:09 localhost systemd-logind[759]: Removed session 50. 
Feb 20 04:15:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28623 DF PROTO=TCP SPT=34900 DPT=9105 SEQ=68265424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EA3680000000001030307) Feb 20 04:15:12 localhost ovn_controller[156798]: 2026-02-20T09:15:12Z|00042|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 ovn-installed in OVS Feb 20 04:15:12 localhost ovn_controller[156798]: 2026-02-20T09:15:12Z|00043|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 up in Southbound Feb 20 04:15:14 localhost systemd[1]: Stopping User Manager for UID 0... Feb 20 04:15:14 localhost systemd[156827]: Activating special unit Exit the Session... Feb 20 04:15:14 localhost systemd[156827]: Stopped target Main User Target. Feb 20 04:15:14 localhost systemd[156827]: Stopped target Basic System. Feb 20 04:15:14 localhost systemd[156827]: Stopped target Paths. Feb 20 04:15:14 localhost systemd[156827]: Stopped target Sockets. Feb 20 04:15:14 localhost systemd[156827]: Stopped target Timers. Feb 20 04:15:14 localhost systemd[156827]: Stopped Daily Cleanup of User's Temporary Directories. Feb 20 04:15:14 localhost systemd[156827]: Closed D-Bus User Message Bus Socket. Feb 20 04:15:14 localhost systemd[156827]: Stopped Create User's Volatile Files and Directories. Feb 20 04:15:14 localhost systemd[156827]: Removed slice User Application Slice. Feb 20 04:15:14 localhost systemd[156827]: Reached target Shutdown. Feb 20 04:15:14 localhost systemd[156827]: Finished Exit the Session. Feb 20 04:15:14 localhost systemd[156827]: Reached target Exit the Session. Feb 20 04:15:14 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 20 04:15:14 localhost systemd[1]: Stopped User Manager for UID 0. Feb 20 04:15:14 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... 
Feb 20 04:15:14 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 20 04:15:14 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 20 04:15:14 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 20 04:15:14 localhost sshd[157464]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:15:14 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 20 04:15:14 localhost systemd-logind[759]: New session 52 of user zuul. Feb 20 04:15:14 localhost systemd[1]: Started Session 52 of User zuul. Feb 20 04:15:15 localhost python3.9[157557]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:15:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61125 DF PROTO=TCP SPT=42648 DPT=9100 SEQ=1731224291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EB3690000000001030307) Feb 20 04:15:17 localhost python3.9[157653]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:17 localhost python3.9[157745]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
attributes=None Feb 20 04:15:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61090 DF PROTO=TCP SPT=53842 DPT=9882 SEQ=3464857739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EB94A0000000001030307) Feb 20 04:15:18 localhost python3.9[157837]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:18 localhost python3.9[157929]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:19 localhost python3.9[158021]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:20 localhost python3.9[158111]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:15:20 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61092 DF PROTO=TCP SPT=53842 DPT=9882 SEQ=3464857739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EC5680000000001030307) Feb 20 04:15:20 localhost python3.9[158203]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Feb 20 04:15:21 localhost python3.9[158293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:22 localhost python3.9[158367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578921.304057-215-173553106590839/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:23 localhost python3.9[158457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:24 localhost python3.9[158530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578923.0118837-261-28122298995605/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER 
validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61093 DF PROTO=TCP SPT=53842 DPT=9882 SEQ=3464857739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598ED5290000000001030307) Feb 20 04:15:24 localhost python3.9[158622]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 20 04:15:25 localhost python3.9[158676]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:15:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2765 DF PROTO=TCP SPT=60428 DPT=9105 SEQ=3726470640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EDFE90000000001030307) Feb 20 04:15:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2766 DF PROTO=TCP SPT=60428 DPT=9105 SEQ=3726470640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EE7E80000000001030307) Feb 20 04:15:30 localhost python3.9[158770]: 
ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 20 04:15:32 localhost python3.9[158863]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:32 localhost python3.9[158934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578931.6544049-371-5781413151881/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35328 DF PROTO=TCP SPT=53118 DPT=9102 SEQ=3089569175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EF3680000000001030307) Feb 20 04:15:33 localhost python3.9[159025]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:33 localhost python3.9[159140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771578932.7195594-371-194159434887944/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:15:34 localhost podman[159231]: 2026-02-20 09:15:34.658609342 +0000 UTC m=+0.085493349 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller) Feb 20 04:15:34 localhost ovn_controller[156798]: 2026-02-20T09:15:34Z|00044|memory|INFO|19064 kB peak resident set size after 30.2 seconds Feb 20 04:15:34 localhost ovn_controller[156798]: 2026-02-20T09:15:34Z|00045|memory|INFO|idl-cells-OVN_Southbound:4072 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:80 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:348 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:157 ofctrl_installed_flow_usage-KB:114 ofctrl_sb_flow_ref_usage-KB:68 Feb 20 04:15:34 localhost podman[159231]: 2026-02-20 09:15:34.722252349 +0000 UTC m=+0.149136336 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:15:34 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:15:34 localhost sshd[159289]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:15:34 localhost python3.9[159282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:35 localhost python3.9[159361]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578934.4460106-504-83461560113569/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:35 localhost python3.9[159451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50310 DF PROTO=TCP SPT=36576 DPT=9101 
SEQ=1287031837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F01F50000000001030307) Feb 20 04:15:36 localhost python3.9[159522]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578935.503693-504-29905404860244/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:36 localhost python3.9[159612]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:15:37 localhost python3.9[159706]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:38 localhost python3.9[159798]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:38 localhost python3.9[159846]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50312 DF PROTO=TCP SPT=36576 DPT=9101 SEQ=1287031837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F0DE80000000001030307) Feb 20 04:15:39 localhost python3.9[159938]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:40 localhost python3.9[159986]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:40 localhost python3.9[160078]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:41 localhost python3.9[160170]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2768 DF PROTO=TCP SPT=60428 DPT=9105 SEQ=3726470640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F17680000000001030307) Feb 20 04:15:41 localhost python3.9[160218]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:42 localhost python3.9[160310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:42 localhost ovn_controller[156798]: 2026-02-20T09:15:42Z|00046|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory Feb 20 04:15:43 localhost python3.9[160358]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 20 04:15:43 localhost python3.9[160450]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:15:43 localhost systemd[1]: Reloading. Feb 20 04:15:44 localhost systemd-sysv-generator[160478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:15:44 localhost systemd-rc-local-generator[160473]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:15:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:15:45 localhost python3.9[160580]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34196 DF PROTO=TCP SPT=35094 DPT=9102 SEQ=887650660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F27680000000001030307) Feb 20 04:15:45 localhost python3.9[160628]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:46 localhost python3.9[160720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:46 localhost python3.9[160768]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57035 DF PROTO=TCP SPT=46410 DPT=9882 SEQ=1134785639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F2E7A0000000001030307) Feb 20 04:15:47 localhost python3.9[160860]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:15:47 localhost systemd[1]: Reloading. Feb 20 04:15:47 localhost systemd-rc-local-generator[160883]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:15:47 localhost systemd-sysv-generator[160887]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 04:15:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:15:48 localhost systemd[1]: Starting Create netns directory... Feb 20 04:15:48 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 20 04:15:48 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 20 04:15:48 localhost systemd[1]: Finished Create netns directory. Feb 20 04:15:48 localhost python3.9[160995]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:49 localhost python3.9[161087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:50 localhost python3.9[161160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578949.110569-956-145697128815745/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57037 DF PROTO=TCP SPT=46410 DPT=9882 SEQ=1134785639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F3A690000000001030307) Feb 20 04:15:50 localhost python3.9[161252]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:51 localhost python3.9[161344]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:15:52 localhost python3.9[161436]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:15:52 localhost python3.9[161511]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578951.8998153-1055-39258560950211/.source.json _original_basename=.qk7skhyr follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:53 localhost python3.9[161601]: 
ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:15:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57038 DF PROTO=TCP SPT=46410 DPT=9882 SEQ=1134785639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F4A280000000001030307) Feb 20 04:15:55 localhost python3.9[161854]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False Feb 20 04:15:57 localhost python3.9[161946]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:15:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11200 DF PROTO=TCP SPT=45326 DPT=9105 SEQ=1923906977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F55280000000001030307) Feb 20 04:15:58 localhost python3[162038]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:15:58 localhost python3[162038]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8",#012 
"Digest": "sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-01-30T06:29:34.446261637Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 785500417,#012 "VirtualSize": 785500417,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc/diff:/var/lib/containers/storage/overlay/33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": 
"/var/lib/containers/storage/overlay/3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",#012 "sha256:d3cc9cdab7e3e7c1a0a6c80e61bbd8cc5eeeba7069bab1cc064ed2e6cc28ed58",#012 "sha256:d5cbf3016eca6267717119e8ebab3c6c083cae6c589c6961ae23bfa93ef3afa4",#012 "sha256:0096ee5d07436ac5b94d9d58b8b2407cc5e6854d70de5e7f89b9a7a1ad4912ad"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD 
[\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.con Feb 20 04:15:58 localhost podman[162090]: 2026-02-20 09:15:58.412687628 +0000 
UTC m=+0.089762231 container remove 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, architecture=x86_64, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 20 04:15:58 localhost python3[162038]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent Feb 20 04:15:58 localhost podman[162103]: Feb 20 04:15:58 localhost podman[162103]: 2026-02-20 09:15:58.523267929 +0000 UTC m=+0.093679764 container create ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:15:58 localhost podman[162103]: 2026-02-20 09:15:58.47757017 +0000 UTC m=+0.047982105 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 20 04:15:58 localhost python3[162038]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 
quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 20 04:15:59 localhost python3.9[162234]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:15:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11201 DF PROTO=TCP SPT=45326 DPT=9105 SEQ=1923906977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F5D280000000001030307) Feb 20 04:16:01 localhost python3.9[162328]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:16:01 localhost python3.9[162374]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:16:02 localhost python3.9[162465]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771578961.7971914-1289-214797618670992/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:16:02 localhost python3.9[162511]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False 
scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:16:02 localhost systemd[1]: Reloading. Feb 20 04:16:03 localhost systemd-sysv-generator[162542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:16:03 localhost systemd-rc-local-generator[162539]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:16:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:16:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57039 DF PROTO=TCP SPT=46410 DPT=9882 SEQ=1134785639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F6B680000000001030307) Feb 20 04:16:03 localhost python3.9[162593]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:16:03 localhost systemd[1]: Reloading. Feb 20 04:16:03 localhost systemd-sysv-generator[162625]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:16:03 localhost systemd-rc-local-generator[162620]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:16:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:16:04 localhost systemd[1]: Starting ovn_metadata_agent container... Feb 20 04:16:04 localhost systemd[1]: tmp-crun.7PUSCp.mount: Deactivated successfully. Feb 20 04:16:04 localhost systemd[1]: Started libcrun container. Feb 20 04:16:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a2e6d660eb1e53ecd61332ea1a4a8a42043dad2af13c4e9eca5a18e5c0fd3f/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 20 04:16:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a2e6d660eb1e53ecd61332ea1a4a8a42043dad2af13c4e9eca5a18e5c0fd3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:16:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:16:04 localhost podman[162635]: 2026-02-20 09:16:04.309887588 +0000 UTC m=+0.150666408 container init ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + sudo -E kolla_set_configs Feb 20 04:16:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:16:04 localhost podman[162635]: 2026-02-20 09:16:04.356301588 +0000 UTC m=+0.197080348 container start ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:16:04 localhost edpm-start-podman-container[162635]: 
ovn_metadata_agent Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Validating config file Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Copying service configuration files Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Writing out command to execute Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for 
/var/lib/neutron/.cache/python-entrypoints Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: ++ cat /run_command Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + CMD=neutron-ovn-metadata-agent Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + ARGS= Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + sudo kolla_copy_cacerts Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: Running command: 'neutron-ovn-metadata-agent' Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + [[ ! -n '' ]] Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + . 
kolla_extend_start Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\''' Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + umask 0022 Feb 20 04:16:04 localhost ovn_metadata_agent[162647]: + exec neutron-ovn-metadata-agent Feb 20 04:16:04 localhost podman[162655]: 2026-02-20 09:16:04.449669042 +0000 UTC m=+0.088737812 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:16:04 localhost edpm-start-podman-container[162634]: Creating additional drop-in dependency for "ovn_metadata_agent" (ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916) Feb 20 04:16:04 localhost systemd[1]: Reloading. Feb 20 04:16:04 localhost podman[162655]: 2026-02-20 09:16:04.531476552 +0000 UTC m=+0.170545312 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:16:04 localhost systemd-rc-local-generator[162722]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:16:04 localhost systemd-sysv-generator[162726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:16:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:16:04 localhost systemd[1]: tmp-crun.uPmUMk.mount: Deactivated successfully. Feb 20 04:16:04 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:16:04 localhost systemd[1]: Started ovn_metadata_agent container. Feb 20 04:16:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:16:04 localhost podman[162737]: 2026-02-20 09:16:04.943226062 +0000 UTC m=+0.084458098 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller) Feb 20 04:16:05 localhost podman[162737]: 2026-02-20 09:16:05.008224548 +0000 UTC m=+0.149456624 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:16:05 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.927 162652 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.927 162652 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.927 162652 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG 
neutron.agent.ovn.metadata_agent [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 
localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] 
dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG 
neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG 
neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent 
[-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG 
neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 
2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG 
neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = 
/etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG 
neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 
162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 
2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 
162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.970 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.970 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.970 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.970 162652 INFO 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.971 162652 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Feb 20 04:16:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:05.986 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name e6b84e4d-7dff-4c2c-96db-c41e3ef520c6 (UUID: e6b84e4d-7dff-4c2c-96db-c41e3ef520c6) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.000 162652 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.000 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.000 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.000 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.002 162652 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.003 162652 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:16:06.011 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:ed:d2 192.168.0.140'], port_security=['fa:16:3e:b0:ed:d2 192.168.0.140'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.140/24', 'neutron:device_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005625204.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de929a91-c460-4398-96e0-15a80685a485', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '91bce661d685472eb3e7cacab17bf52a', 'neutron:revision_number': '7', 'neutron:security_group_ids': '571bc6f6-22b1-4aad-9b70-3481475089c6 dd806cfc-5243-4295-bd9f-cfd9f58a9f1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee1d7cd7-5f4f-4b75-a06c-f37c0ef97c77, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e7aa8e2a-27a6-452b-906c-21cea166b882) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.012 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'e6b84e4d-7dff-4c2c-96db-c41e3ef520c6'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': '5583b60b-563a-5b85-8f5f-a322cc499504', 'neutron:ovn-metadata-sb-cfg': '1'}, 
name=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, nb_cfg_timestamp=1771578913263, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.012 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e7aa8e2a-27a6-452b-906c-21cea166b882 in datapath de929a91-c460-4398-96e0-15a80685a485 bound to our chassis on insert#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.013 162652 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.013 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.014 162652 INFO oslo_service.service [-] Starting 1 workers#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.016 162652 DEBUG oslo_service.service [-] Started child 162777 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.019 162652 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network de929a91-c460-4398-96e0-15a80685a485#033[00m Feb 20 04:16:06 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:06.020 162652 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpkw_qn71r/privsep.sock']#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.020 162777 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-383389'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.045 162777 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.045 162777 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.046 162777 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.049 162777 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.052 162777 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.064 162777 INFO eventlet.wsgi.server [-] (162777) wsgi starting up on 
http:/var/lib/neutron/metadata_proxy#033[00m Feb 20 04:16:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2048 DF PROTO=TCP SPT=49734 DPT=9101 SEQ=3173854424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F77250000000001030307) Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.670 162652 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.671 162652 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkw_qn71r/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.551 162782 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.557 162782 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.561 162782 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.561 162782 INFO oslo.privsep.daemon [-] privsep daemon running as pid 162782#033[00m Feb 20 04:16:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:06.674 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fa6b76-f103-4179-83dd-e50cb02b89de]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:07.103 162782 DEBUG oslo_concurrency.lockutils [-] Acquiring lock 
"context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:16:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:07.103 162782 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:16:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:07.103 162782 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:16:07 localhost python3.9[162862]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 20 04:16:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:07.574 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[497afd09-98b7-4178-a4d5-03a8b33b2c77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:07.576 162652 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp4kurx1fa/privsep.sock']#033[00m Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.167 162652 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.168 162652 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp4kurx1fa/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 20 04:16:08 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:08.048 162915 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.052 162915 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.055 162915 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.055 162915 INFO oslo.privsep.daemon [-] privsep daemon running as pid 162915#033[00m Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.171 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef5062e-311b-4182-8cb0-cd8f575de127]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:08 localhost python3.9[162964]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.610 162915 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.610 162915 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:16:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:08.611 162915 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 
0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:16:09 localhost python3.9[163040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578967.9937506-1424-172256819390940/.source.yaml _original_basename=.7diqfqtd follow=False checksum=00f5f1349c1b2f1d82b680e3efe9b7b384555dee backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.079 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[4036496c-845b-4350-a15c-c529a8957301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.082 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[462ca290-4442-49fc-84b4-8dcbe7c61ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.102 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[794e88bb-b6e6-44ca-bd44-5575295d349d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.115 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[72ad446d-c238-483c-af44-f9b58f550c1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde929a91-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], 
['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:09:c2:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 
'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637504, 'reachable_time': 41962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 
16, 'flags': 2, 'sequence_number': 255, 'pid': 163060, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.129 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[31fd05b8-49b2-4987-809e-984a2b0b1bf6]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tapde929a91-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637514, 'tstamp': 637514}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163061, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapde929a91-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637516, 'tstamp': 637516}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163061, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637512, 'tstamp': 637512}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163061, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 
'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:c288'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637504, 'tstamp': 637504}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163061, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.181 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[2b61d1eb-9fa9-4b12-86c4-03df96583e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.183 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde929a91-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.188 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde929a91-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.188 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.189 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapde929a91-c0, 
col_values=(('external_ids', {'iface-id': '3323e11d-576a-42f3-bcca-e10425268e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.190 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.194 162652 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp6c2bt20r/privsep.sock']#033[00m Feb 20 04:16:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2050 DF PROTO=TCP SPT=49734 DPT=9101 SEQ=3173854424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F83280000000001030307) Feb 20 04:16:09 localhost systemd[1]: session-52.scope: Deactivated successfully. Feb 20 04:16:09 localhost systemd[1]: session-52.scope: Consumed 32.822s CPU time. Feb 20 04:16:09 localhost systemd-logind[759]: Session 52 logged out. Waiting for processes to exit. Feb 20 04:16:09 localhost systemd-logind[759]: Removed session 52. 
Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.784 162652 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.786 162652 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6c2bt20r/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.669 163070 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.675 163070 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.679 163070 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.679 163070 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163070#033[00m Feb 20 04:16:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:09.789 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[74590516-1d7b-4131-b7f3-e14068df2bfb]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.237 163070 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.237 163070 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:16:10.238 163070 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.697 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[58177eb7-4569-40df-b1b8-d83039047cd2]: (4, ['ovnmeta-de929a91-c460-4398-96e0-15a80685a485']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.701 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, column=external_ids, values=({'neutron:ovn-metadata-id': '5583b60b-563a-5b85-8f5f-a322cc499504'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.702 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.703 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.712 162652 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.712 162652 DEBUG 
oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.712 162652 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.713 162652 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.713 162652 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.713 162652 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.713 162652 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.715 162652 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.715 162652 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.715 162652 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.715 162652 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.717 162652 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.717 162652 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.717 162652 DEBUG oslo_service.service [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.717 162652 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.718 162652 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 
'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.718 162652 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.718 162652 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.718 162652 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.720 162652 DEBUG oslo_service.service [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.720 162652 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.720 162652 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.722 162652 DEBUG oslo_service.service [-] 
ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.722 162652 DEBUG oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.722 162652 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.722 162652 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] 
logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.725 162652 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.725 162652 DEBUG oslo_service.service [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.725 162652 DEBUG oslo_service.service [-] 
max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.725 162652 DEBUG oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.726 162652 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.726 162652 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.726 162652 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.726 162652 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 
162652 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.728 162652 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.728 162652 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.728 162652 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.728 162652 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.732 162652 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.732 162652 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.732 162652 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.732 162652 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 
localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:16:10.735 162652 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.736 162652 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.736 162652 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.736 162652 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.736 162652 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.737 162652 DEBUG oslo_service.service [-] profiler.enabled = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.737 162652 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.737 162652 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.737 162652 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.738 162652 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.738 162652 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.738 162652 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.738 162652 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 
162652 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.740 162652 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.740 162652 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.740 162652 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.740 162652 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.741 162652 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.741 162652 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.741 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.741 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.742 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.742 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.742 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.742 162652 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] service_providers.service_provider = [] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 
DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.750 162652 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.750 162652 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.750 162652 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.750 162652 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.752 162652 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.752 162652 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.752 162652 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.752 162652 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.754 162652 DEBUG oslo_service.service [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.754 162652 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.754 162652 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.754 162652 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.endpoint_type = 
public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.certfile = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] 
ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG 
oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 
DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG 
oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 
162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG 
oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG 
oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:16:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 20 04:16:11 localhost sshd[163075]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:16:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11203 DF PROTO=TCP SPT=45326 DPT=9105 SEQ=1923906977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F8D680000000001030307) Feb 20 04:16:14 localhost sshd[163077]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:16:14 localhost systemd-logind[759]: New session 53 of user zuul. Feb 20 04:16:14 localhost systemd[1]: Started Session 53 of User zuul. 
Feb 20 04:16:15 localhost python3.9[163170]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:16:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21837 DF PROTO=TCP SPT=52104 DPT=9100 SEQ=504754750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F9D690000000001030307) Feb 20 04:16:16 localhost python3.9[163266]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:16:17 localhost python3.9[163371]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:16:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12391 DF PROTO=TCP SPT=55174 DPT=9882 SEQ=2870933854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FA3AA0000000001030307) Feb 20 04:16:17 localhost systemd[1]: libpod-5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def.scope: Deactivated successfully. 
Feb 20 04:16:17 localhost podman[163372]: 2026-02-20 09:16:17.758031796 +0000 UTC m=+0.073570094 container died 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git) Feb 20 04:16:17 localhost podman[163372]: 2026-02-20 09:16:17.78556363 +0000 UTC m=+0.101101918 container cleanup 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc.) 
Feb 20 04:16:17 localhost podman[163385]: 2026-02-20 09:16:17.841121032 +0000 UTC m=+0.077420324 container remove 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.openshift.expose-services=, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13) Feb 20 04:16:17 localhost systemd[1]: libpod-conmon-5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def.scope: Deactivated successfully. Feb 20 04:16:18 localhost systemd[1]: var-lib-containers-storage-overlay-0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9-merged.mount: Deactivated successfully. 
Feb 20 04:16:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def-userdata-shm.mount: Deactivated successfully. Feb 20 04:16:18 localhost python3.9[163493]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:16:18 localhost systemd[1]: Reloading. Feb 20 04:16:19 localhost systemd-sysv-generator[163519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:16:19 localhost systemd-rc-local-generator[163515]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:16:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:16:20 localhost python3.9[163619]: ansible-ansible.builtin.service_facts Invoked Feb 20 04:16:20 localhost network[163636]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 04:16:20 localhost network[163637]: 'network-scripts' will be removed from distribution in near future. Feb 20 04:16:20 localhost network[163638]: It is advised to switch to 'NetworkManager' instead for network management. 
Feb 20 04:16:21 localhost sshd[163649]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:16:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12393 DF PROTO=TCP SPT=55174 DPT=9882 SEQ=2870933854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FAFA80000000001030307) Feb 20 04:16:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:16:24 localhost python3.9[163841]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:16:24 localhost systemd[1]: Reloading. Feb 20 04:16:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12394 DF PROTO=TCP SPT=55174 DPT=9882 SEQ=2870933854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FBF680000000001030307) Feb 20 04:16:24 localhost systemd-rc-local-generator[163868]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:16:24 localhost systemd-sysv-generator[163873]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:16:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:16:25 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. 
Feb 20 04:16:25 localhost python3.9[163973]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:16:26 localhost python3.9[164066]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:16:27 localhost python3.9[164159]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:16:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17314 DF PROTO=TCP SPT=39374 DPT=9105 SEQ=4190471795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FCA680000000001030307) Feb 20 04:16:28 localhost python3.9[164252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:16:28 localhost python3.9[164345]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:16:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17315 DF PROTO=TCP SPT=39374 DPT=9105 SEQ=4190471795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FD2680000000001030307) Feb 20 04:16:30 localhost sshd[164439]: main: 
sshd: ssh-rsa algorithm is disabled Feb 20 04:16:30 localhost python3.9[164438]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:16:32 localhost python3.9[164533]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:16:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12395 DF PROTO=TCP SPT=55174 DPT=9882 SEQ=2870933854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FDF680000000001030307) Feb 20 04:16:33 localhost python3.9[164625]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:16:33 localhost sshd[164672]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:16:33 localhost python3.9[164719]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:16:34 localhost python3.9[164811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:16:34 localhost python3.9[164917]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:16:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:16:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:16:35 localhost podman[164993]: 2026-02-20 09:16:35.150103725 +0000 UTC m=+0.081245385 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 04:16:35 localhost 
podman[164993]: 2026-02-20 09:16:35.158516167 +0000 UTC m=+0.089657827 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:16:35 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:16:35 localhost podman[164992]: 2026-02-20 09:16:35.204975998 +0000 UTC m=+0.136723276 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller) Feb 20 04:16:35 localhost podman[164992]: 2026-02-20 09:16:35.242050098 +0000 UTC m=+0.173797376 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 04:16:35 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:16:35 localhost python3.9[165087]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:36 localhost python3.9[165191]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28772 DF PROTO=TCP SPT=58014 DPT=9101 SEQ=813715877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FEC550000000001030307)
Feb 20 04:16:36 localhost python3.9[165298]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:37 localhost python3.9[165390]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:37 localhost python3.9[165482]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:38 localhost python3.9[165574]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:39 localhost python3.9[165666]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28774 DF PROTO=TCP SPT=58014 DPT=9101 SEQ=813715877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FF8690000000001030307)
Feb 20 04:16:39 localhost python3.9[165758]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:40 localhost python3.9[165850]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:16:41 localhost python3.9[165942]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:16:41 localhost python3.9[166034]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 04:16:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17317 DF PROTO=TCP SPT=39374 DPT=9105 SEQ=4190471795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599003680000000001030307)
Feb 20 04:16:42 localhost python3.9[166126]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 04:16:42 localhost sshd[166127]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:16:42 localhost systemd[1]: Reloading.
Feb 20 04:16:42 localhost systemd-rc-local-generator[166152]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:16:42 localhost systemd-sysv-generator[166159]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:16:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:16:43 localhost python3.9[166256]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:16:44 localhost python3.9[166349]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:16:45 localhost python3.9[166442]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:16:45 localhost python3.9[166535]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:16:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8053 DF PROTO=TCP SPT=33848 DPT=9102 SEQ=2895410563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599011680000000001030307)
Feb 20 04:16:46 localhost python3.9[166628]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:16:46 localhost python3.9[166721]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:16:47 localhost python3.9[166814]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:16:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25090 DF PROTO=TCP SPT=47008 DPT=9882 SEQ=3729068539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599018DA0000000001030307)
Feb 20 04:16:49 localhost python3.9[166907]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 20 04:16:50 localhost python3.9[167000]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 20 04:16:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25092 DF PROTO=TCP SPT=47008 DPT=9882 SEQ=3729068539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599024E80000000001030307)
Feb 20 04:16:51 localhost python3.9[167098]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 20 04:16:52 localhost python3.9[167198]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:16:53 localhost python3.9[167252]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:16:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25093 DF PROTO=TCP SPT=47008 DPT=9882 SEQ=3729068539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599034A80000000001030307)
Feb 20 04:16:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53974 DF PROTO=TCP SPT=42128 DPT=9105 SEQ=3538631678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59903FA80000000001030307)
Feb 20 04:16:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53975 DF PROTO=TCP SPT=42128 DPT=9105 SEQ=3538631678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599047A80000000001030307)
Feb 20 04:17:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28627 DF PROTO=TCP SPT=54526 DPT=9100 SEQ=1245713767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599053680000000001030307)
Feb 20 04:17:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:17:05.973 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 04:17:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:17:05.975 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 04:17:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:17:05.977 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 04:17:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:17:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:17:06 localhost podman[167324]: 2026-02-20 09:17:06.14360864 +0000 UTC m=+0.069570630 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 04:17:06 localhost podman[167324]: 2026-02-20 09:17:06.147342134 +0000 UTC m=+0.073304084 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:17:06 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:17:06 localhost podman[167323]: 2026-02-20 09:17:06.213038664 +0000 UTC m=+0.137042283 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 04:17:06 localhost podman[167323]: 2026-02-20 09:17:06.247207749 +0000 UTC m=+0.171211368 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 04:17:06 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:17:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41854 DF PROTO=TCP SPT=45660 DPT=9101 SEQ=3022912987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599061850000000001030307)
Feb 20 04:17:07 localhost sshd[167364]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:17:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41856 DF PROTO=TCP SPT=45660 DPT=9101 SEQ=3022912987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59906DA90000000001030307)
Feb 20 04:17:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53977 DF PROTO=TCP SPT=42128 DPT=9105 SEQ=3538631678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599077680000000001030307)
Feb 20 04:17:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22578 DF PROTO=TCP SPT=42162 DPT=9100 SEQ=3643579773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599087680000000001030307)
Feb 20 04:17:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19322 DF PROTO=TCP SPT=53742 DPT=9882 SEQ=3103143237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59908E0B0000000001030307)
Feb 20 04:17:18 localhost kernel: SELinux: Converting 2759 SID table entries...
Feb 20 04:17:18 localhost kernel: SELinux: Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Feb 20 04:17:18 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 04:17:18 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 04:17:18 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 04:17:18 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 04:17:18 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 04:17:18 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 04:17:18 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 04:17:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19324 DF PROTO=TCP SPT=53742 DPT=9882 SEQ=3103143237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59909A280000000001030307)
Feb 20 04:17:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19325 DF PROTO=TCP SPT=53742 DPT=9882 SEQ=3103143237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990A9E80000000001030307)
Feb 20 04:17:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30192 DF PROTO=TCP SPT=41982 DPT=9105 SEQ=1281184047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990B4A80000000001030307)
Feb 20 04:17:28 localhost kernel: SELinux: Converting 2762 SID table entries...
Feb 20 04:17:28 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 04:17:28 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 04:17:28 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 04:17:28 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 04:17:28 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 04:17:28 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 04:17:28 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 04:17:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30193 DF PROTO=TCP SPT=41982 DPT=9105 SEQ=1281184047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990BCA80000000001030307)
Feb 20 04:17:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19326 DF PROTO=TCP SPT=53742 DPT=9882 SEQ=3103143237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990C9680000000001030307)
Feb 20 04:17:36 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=20 res=1
Feb 20 04:17:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:17:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:17:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27263 DF PROTO=TCP SPT=35370 DPT=9101 SEQ=3360427931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990D6B50000000001030307)
Feb 20 04:17:36 localhost systemd[1]: tmp-crun.rA68cv.mount: Deactivated successfully.
Feb 20 04:17:36 localhost systemd[1]: tmp-crun.8zysPZ.mount: Deactivated successfully.
Feb 20 04:17:36 localhost podman[168467]: 2026-02-20 09:17:36.425199329 +0000 UTC m=+0.150949712 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 04:17:36 localhost podman[168466]: 2026-02-20 09:17:36.389159556 +0000 UTC m=+0.115247230 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 04:17:36 localhost podman[168466]: 2026-02-20 09:17:36.472077537 +0000 UTC m=+0.198165181 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:17:36 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:17:36 localhost podman[168467]: 2026-02-20 09:17:36.501943169 +0000 UTC m=+0.227693542 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 04:17:36 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:17:38 localhost kernel: SELinux: Converting 2765 SID table entries...
Feb 20 04:17:38 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 20 04:17:38 localhost kernel: SELinux: policy capability open_perms=1 Feb 20 04:17:38 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 20 04:17:38 localhost kernel: SELinux: policy capability always_check_network=0 Feb 20 04:17:38 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 20 04:17:38 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 20 04:17:38 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 20 04:17:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27265 DF PROTO=TCP SPT=35370 DPT=9101 SEQ=3360427931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990E2A80000000001030307) Feb 20 04:17:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30195 DF PROTO=TCP SPT=41982 DPT=9105 SEQ=1281184047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990ED680000000001030307) Feb 20 04:17:44 localhost sshd[168588]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:17:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22069 DF PROTO=TCP SPT=35078 DPT=9100 SEQ=282675773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990FD680000000001030307) Feb 20 04:17:46 localhost kernel: SELinux: Converting 2765 SID table entries... 
Feb 20 04:17:46 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 04:17:46 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 04:17:46 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 04:17:46 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 04:17:46 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 04:17:46 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 04:17:46 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 04:17:47 localhost systemd[1]: Reloading.
Feb 20 04:17:47 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=22 res=1
Feb 20 04:17:47 localhost systemd-rc-local-generator[168619]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:17:47 localhost systemd-sysv-generator[168627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:17:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:17:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29435 DF PROTO=TCP SPT=34378 DPT=9882 SEQ=3238976354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991033A0000000001030307)
Feb 20 04:17:47 localhost systemd[1]: Reloading.
Feb 20 04:17:47 localhost systemd-rc-local-generator[168659]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:17:47 localhost systemd-sysv-generator[168664]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:17:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:17:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29437 DF PROTO=TCP SPT=34378 DPT=9882 SEQ=3238976354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59910F290000000001030307)
Feb 20 04:17:53 localhost sshd[168679]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:17:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29438 DF PROTO=TCP SPT=34378 DPT=9882 SEQ=3238976354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59911EE90000000001030307)
Feb 20 04:17:56 localhost kernel: SELinux: Converting 2766 SID table entries...
Feb 20 04:17:56 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 20 04:17:56 localhost kernel: SELinux: policy capability open_perms=1
Feb 20 04:17:56 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 20 04:17:56 localhost kernel: SELinux: policy capability always_check_network=0
Feb 20 04:17:56 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 20 04:17:56 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 20 04:17:56 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 20 04:17:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39582 DF PROTO=TCP SPT=59270 DPT=9105 SEQ=1688412850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599129E80000000001030307)
Feb 20 04:17:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39583 DF PROTO=TCP SPT=59270 DPT=9105 SEQ=1688412850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599131E80000000001030307)
Feb 20 04:18:00 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
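[editor's note] The recurring kernel "DROPPING:" entries above are netfilter LOG output; the "DROPPING:" prefix is assumed to come from the firewall rule's log prefix, which this journal excerpt does not show. A minimal sketch of pulling the KEY=VALUE fields out of one such entry and tallying drops per destination port:

```python
import re
from collections import Counter

# The field layout (SRC=, DST=, SPT=, DPT=, ...) is the standard netfilter
# LOG format; the "DROPPING:" prefix is assumed to be rule-configured.
FIELD_RE = re.compile(r"(\w+)=(\S*)")

def parse_drop(line):
    """Return a dict of KEY=VALUE fields from one DROPPING log entry."""
    payload = line.split("DROPPING:", 1)[1]
    return dict(FIELD_RE.findall(payload))

def drops_by_port(lines):
    """Count dropped packets per destination port across many entries."""
    return Counter(parse_drop(l)["DPT"] for l in lines if "DROPPING:" in l)

# One of the entries above, shortened to the fields we read:
sample = ("Feb 20 04:17:39 localhost kernel: DROPPING: IN=br-ex OUT= "
          "SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TTL=62 ID=27265 "
          "DF PROTO=TCP SPT=35370 DPT=9101 SYN")
fields = parse_drop(sample)
```

Fed the full journal, `drops_by_port` would show the drops cluster on ports 9100-9105 and 9882, i.e. repeated SYNs from 192.168.122.10 being rejected on br-ex.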
Feb 20 04:18:00 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=23 res=1 Feb 20 04:18:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28096 DF PROTO=TCP SPT=39276 DPT=9102 SEQ=3486674986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59913D680000000001030307) Feb 20 04:18:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:18:05.973 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:18:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:18:05.974 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:18:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:18:05.976 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:18:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27872 DF PROTO=TCP SPT=50366 DPT=9101 SEQ=3142144691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59914BE50000000001030307) Feb 20 04:18:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:18:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:18:07 localhost systemd[1]: tmp-crun.bkPpJq.mount: Deactivated successfully. Feb 20 04:18:07 localhost podman[168923]: 2026-02-20 09:18:07.171523127 +0000 UTC m=+0.094366433 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Feb 20 04:18:07 localhost podman[168922]: 2026-02-20 09:18:07.221410884 +0000 UTC m=+0.145938312 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller) Feb 20 04:18:07 localhost podman[168923]: 2026-02-20 09:18:07.252695737 +0000 UTC m=+0.175539053 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:18:07 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:18:07 localhost podman[168922]: 2026-02-20 09:18:07.269047924 +0000 UTC m=+0.193575392 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:18:07 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:18:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27874 DF PROTO=TCP SPT=50366 DPT=9101 SEQ=3142144691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599157E80000000001030307)
Feb 20 04:18:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39585 DF PROTO=TCP SPT=59270 DPT=9105 SEQ=1688412850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599161680000000001030307)
Feb 20 04:18:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45680 DF PROTO=TCP SPT=36640 DPT=9102 SEQ=1733988649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599171680000000001030307)
Feb 20 04:18:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58737 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2583904414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599178800000000001030307)
Feb 20 04:18:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58739 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2583904414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599184A90000000001030307)
Feb 20 04:18:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58740 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2583904414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599194680000000001030307)
Feb 20 04:18:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48292 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=2261576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59919F280000000001030307)
Feb 20 04:18:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48293 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=2261576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991A7280000000001030307)
Feb 20 04:18:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58741 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2583904414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991B5680000000001030307)
Feb 20 04:18:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25492 DF PROTO=TCP SPT=41030 DPT=9101 SEQ=1941935264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991C1150000000001030307)
Feb 20 04:18:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:18:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:18:37 localhost podman[186038]: 2026-02-20 09:18:37.97060908 +0000 UTC m=+0.090496935 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:18:38 localhost podman[186039]: 2026-02-20 09:18:38.018671332 +0000 UTC m=+0.134977078 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:18:38 localhost podman[186039]: 2026-02-20 09:18:38.028954495 +0000 UTC m=+0.145260201 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true) Feb 20 04:18:38 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:18:38 localhost podman[186038]: 2026-02-20 09:18:38.084085483 +0000 UTC m=+0.203973378 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:18:38 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
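[editor's note] The podman health_status/exec_died entries above pack the container's labels and config into one parenthesized blob. A small sketch of extracting just the event type, container name, and health status from such a line; only the flat "key=value" labels are read, and the nested config_data dict is deliberately ignored:

```python
import re

# Matches "container <event> <64-hex-id> (<labels>)" as emitted in the
# journal by podman events; the greedy ".*" takes the labels up to the
# final closing parenthesis.
EVENT_RE = re.compile(r"container (?P<event>\w+) (?P<cid>[0-9a-f]{64}) \((?P<labels>.*)\)")

def parse_podman_event(line):
    m = EVENT_RE.search(line)
    if not m:
        return None
    out = {"event": m.group("event"), "cid": m.group("cid")}
    # (?<!\.) keeps "name=" from matching "org.label-schema.name=".
    for key in ("name", "health_status"):
        km = re.search(rf"(?<!\.)\b{key}=([^,)]+)", m.group("labels"))
        if km:
            out[key] = km.group(1)
    return out

# One of the lines above, trimmed to a few labels:
sample = ("Feb 20 04:18:07 localhost podman[168922]: container health_status "
          "67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f "
          "(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, "
          "name=ovn_controller, health_status=healthy, managed_by=edpm_ansible)")
event = parse_podman_event(sample)
```

This makes the repeating 30-second cycle above easy to confirm: each healthcheck timer fires, reports health_status=healthy, then the transient exec session dies and systemd marks the unit deactivated.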
Feb 20 04:18:38 localhost sshd[186115]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:18:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25494 DF PROTO=TCP SPT=41030 DPT=9101 SEQ=1941935264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991CD280000000001030307)
Feb 20 04:18:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48295 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=2261576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991D7680000000001030307)
Feb 20 04:18:42 localhost systemd[1]: Stopping OpenSSH server daemon...
Feb 20 04:18:42 localhost systemd[1]: sshd.service: Deactivated successfully.
Feb 20 04:18:42 localhost systemd[1]: Stopped OpenSSH server daemon.
Feb 20 04:18:42 localhost systemd[1]: sshd.service: Consumed 2.379s CPU time, read 32.0K from disk, written 0B to disk.
Feb 20 04:18:42 localhost systemd[1]: Stopped target sshd-keygen.target.
Feb 20 04:18:42 localhost systemd[1]: Stopping sshd-keygen.target...
Feb 20 04:18:42 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 04:18:42 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 04:18:42 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 04:18:42 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 20 04:18:42 localhost systemd[1]: Starting OpenSSH server daemon...
Feb 20 04:18:42 localhost sshd[186832]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:18:42 localhost systemd[1]: Started OpenSSH server daemon.
Feb 20 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 04:18:44 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 20 04:18:44 localhost systemd[1]: Reloading.
Feb 20 04:18:44 localhost systemd-sysv-generator[187084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:18:44 localhost systemd-rc-local-generator[187078]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:44 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 04:18:44 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
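[editor's note] The repeated "Failed to parse service type, ignoring: notify-reload" messages indicate the libvirt unit files ship Type=notify-reload, a service type added in systemd v253, while this host's systemd predates it and falls back to plain Type=notify; the MemoryLimit= warning flags a directive deprecated in favor of MemoryMax=. A hedged, illustrative checker for both issues (not part of the deployment tooling):

```python
import re

def audit_unit_text(name, text):
    """Flag the two unit-file issues systemd logs above: Type=notify-reload
    on systemd older than v253, and the deprecated MemoryLimit= directive."""
    findings = []
    for n, line in enumerate(text.splitlines(), 1):
        if re.match(r"\s*Type\s*=\s*notify-reload\s*$", line):
            findings.append(f"{name}:{n}: Type=notify-reload requires systemd >= 253")
        if re.match(r"\s*MemoryLimit\s*=", line):
            findings.append(f"{name}:{n}: MemoryLimit= is deprecated; use MemoryMax=")
    return findings

# Hypothetical unit-file fragment combining both patterns for illustration:
sample = "[Service]\nType=notify-reload\nMemoryLimit=1G\n"
report = audit_unit_text("virtqemud.service", sample)
```

Both messages are warnings, not failures: systemd still starts the services, which is why the log continues normally afterwards.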
Feb 20 04:18:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34022 DF PROTO=TCP SPT=49436 DPT=9102 SEQ=3458103529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991E7680000000001030307)
Feb 20 04:18:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37974 DF PROTO=TCP SPT=41034 DPT=9882 SEQ=18082864 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991ED9A0000000001030307)
Feb 20 04:18:48 localhost python3.9[191926]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 04:18:48 localhost systemd[1]: Reloading.
Feb 20 04:18:48 localhost systemd-sysv-generator[192344]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:18:48 localhost systemd-rc-local-generator[192338]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:48 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:49 localhost sshd[193064]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:18:49 localhost python3.9[193012]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 04:18:49 localhost systemd[1]: Reloading.
Feb 20 04:18:50 localhost systemd-sysv-generator[193158]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:18:50 localhost systemd-rc-local-generator[193154]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:50 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:18:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37976 DF PROTO=TCP SPT=41034 DPT=9882 SEQ=18082864 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991F9A80000000001030307)
Feb 20 04:18:51 localhost python3.9[194090]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 04:18:52 localhost systemd[1]: Reloading.
Feb 20 04:18:52 localhost systemd-sysv-generator[194279]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:18:52 localhost systemd-rc-local-generator[194276]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:52 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:53 localhost python3.9[194776]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 20 04:18:53 localhost 
systemd[1]: Reloading. Feb 20 04:18:53 localhost systemd-sysv-generator[195002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:18:53 localhost systemd-rc-local-generator[194997]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:53 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37977 DF PROTO=TCP SPT=41034 DPT=9882 
SEQ=18082864 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599209690000000001030307) Feb 20 04:18:55 localhost python3.9[195845]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:18:55 localhost systemd[1]: Reloading. Feb 20 04:18:55 localhost systemd-sysv-generator[196085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:18:55 localhost systemd-rc-local-generator[196080]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:55 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:55 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 20 04:18:55 localhost systemd[1]: Finished man-db-cache-update.service. Feb 20 04:18:55 localhost systemd[1]: man-db-cache-update.service: Consumed 13.952s CPU time. Feb 20 04:18:55 localhost systemd[1]: run-r2a2a39551e3b4b5b9c3115cb5b5ff1b8.service: Deactivated successfully. Feb 20 04:18:55 localhost systemd[1]: run-rbcb2b767c39941cf8bc81e61264cc9b0.service: Deactivated successfully. Feb 20 04:18:56 localhost python3.9[196417]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:18:56 localhost systemd[1]: Reloading. Feb 20 04:18:56 localhost systemd-sysv-generator[196457]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:18:56 localhost systemd-rc-local-generator[196451]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:56 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost python3.9[196574]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:18:57 localhost systemd[1]: Reloading. Feb 20 04:18:57 localhost systemd-sysv-generator[196608]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:18:57 localhost systemd-rc-local-generator[196603]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:18:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39099 DF PROTO=TCP SPT=44092 DPT=9105 SEQ=2269906091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599214690000000001030307) Feb 20 04:18:58 localhost python3.9[196723]: 
ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:18:59 localhost python3.9[196836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:18:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39100 DF PROTO=TCP SPT=44092 DPT=9105 SEQ=2269906091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59921C690000000001030307) Feb 20 04:19:00 localhost systemd[1]: Reloading. Feb 20 04:19:00 localhost systemd-sysv-generator[196870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:19:00 localhost systemd-rc-local-generator[196863]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37978 DF PROTO=TCP SPT=41034 DPT=9882 SEQ=18082864 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599229680000000001030307) Feb 20 04:19:03 localhost python3.9[196985]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 20 04:19:03 localhost systemd[1]: Reloading. Feb 20 04:19:03 localhost systemd-rc-local-generator[197010]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:19:04 localhost systemd-sysv-generator[197015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:04 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:19:05 localhost python3.9[197133]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:05 localhost python3.9[197246]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:19:05.974 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:19:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:19:05.975 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:19:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:19:05.976 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:19:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63702 DF PROTO=TCP SPT=39996 DPT=9101 SEQ=1291160012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599236450000000001030307) Feb 20 04:19:06 localhost python3.9[197359]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:19:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:19:08 localhost podman[197474]: 2026-02-20 09:19:08.281664599 +0000 UTC m=+0.093900819 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:19:08 localhost 
podman[197474]: 2026-02-20 09:19:08.291097836 +0000 UTC m=+0.103334086 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:19:08 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:19:08 localhost podman[197473]: 2026-02-20 09:19:08.385901421 +0000 UTC m=+0.198274185 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 20 04:19:08 localhost podman[197473]: 2026-02-20 09:19:08.424935899 +0000 UTC m=+0.237308663 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:19:08 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:19:08 localhost python3.9[197472]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:09 localhost python3.9[197629]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63704 DF PROTO=TCP SPT=39996 DPT=9101 SEQ=1291160012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599242680000000001030307) Feb 20 04:19:10 localhost python3.9[197742]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:10 localhost python3.9[197855]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:11 localhost python3.9[197968]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39102 DF PROTO=TCP SPT=44092 DPT=9105 SEQ=2269906091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59924D690000000001030307) Feb 20 04:19:12 localhost python3.9[198081]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False 
daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:13 localhost python3.9[198194]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:14 localhost python3.9[198307]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:15 localhost python3.9[198420]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6504 DF PROTO=TCP SPT=46248 DPT=9102 SEQ=1483972815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59925B690000000001030307) Feb 20 04:19:16 localhost python3.9[198533]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:17 localhost python3.9[198646]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Feb 20 04:19:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21087 DF PROTO=TCP SPT=38838 DPT=9882 SEQ=64419992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599262CB0000000001030307) Feb 20 04:19:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac 
MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21089 DF PROTO=TCP SPT=38838 DPT=9882 SEQ=64419992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59926EE90000000001030307) Feb 20 04:19:22 localhost python3.9[198759]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 20 04:19:23 localhost python3.9[198869]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 20 04:19:23 localhost sshd[198870]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:19:23 localhost python3.9[198981]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:19:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21090 DF PROTO=TCP SPT=38838 DPT=9882 SEQ=64419992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A59927EA80000000001030307) Feb 20 04:19:25 localhost python3.9[199091]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:19:25 localhost python3.9[199201]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:19:26 localhost python3.9[199311]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 20 04:19:27 localhost python3.9[199419]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:19:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52280 DF PROTO=TCP SPT=58718 DPT=9105 SEQ=2224049347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599289680000000001030307) Feb 20 04:19:28 localhost python3.9[199529]: ansible-ansible.legacy.stat 
Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:29 localhost python3.9[199619]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579167.768335-1665-237756172481434/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52281 DF PROTO=TCP SPT=58718 DPT=9105 SEQ=2224049347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599291680000000001030307) Feb 20 04:19:29 localhost python3.9[199729]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:29 localhost sshd[199731]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:19:30 localhost python3.9[199821]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579169.3523057-1665-91867724010230/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:30 localhost 
python3.9[199931]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:31 localhost python3.9[200021]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579170.5336275-1665-53117978826558/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:32 localhost python3.9[200131]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31659 DF PROTO=TCP SPT=34970 DPT=9100 SEQ=4001655195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59929D680000000001030307) Feb 20 04:19:32 localhost python3.9[200221]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579171.7030823-1665-151640048376763/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:33 localhost python3.9[200331]: ansible-ansible.legacy.stat 
Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:33 localhost python3.9[200421]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579172.8879738-1665-4352235216342/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:34 localhost python3.9[200531]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:35 localhost python3.9[200621]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579174.0234854-1665-64128306467384/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:36 localhost python3.9[200731]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37788 DF PROTO=TCP SPT=60826 
DPT=9101 SEQ=241455690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992AB750000000001030307) Feb 20 04:19:36 localhost python3.9[200819]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579175.626153-1665-129836363519062/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:37 localhost python3.9[200929]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:19:38 localhost podman[201019]: 2026-02-20 09:19:38.445572479 +0000 UTC m=+0.082943072 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:19:38 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:19:38 localhost podman[201019]: 2026-02-20 09:19:38.48001336 +0000 UTC m=+0.117383923 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team) Feb 20 04:19:38 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:19:38 localhost systemd[1]: tmp-crun.lvvK0A.mount: Deactivated successfully. Feb 20 04:19:38 localhost python3.9[201020]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579177.4699535-1665-202993156818799/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:38 localhost podman[201038]: 2026-02-20 09:19:38.561698391 +0000 UTC m=+0.088104104 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:19:38 localhost podman[201038]: 2026-02-20 09:19:38.622200218 +0000 UTC m=+0.148605941 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:19:38 localhost systemd[1]: 
67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:19:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37790 DF PROTO=TCP SPT=60826 DPT=9101 SEQ=241455690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992B7680000000001030307) Feb 20 04:19:39 localhost python3.9[201206]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:40 localhost python3.9[201348]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:41 localhost python3.9[201458]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:41 localhost python3.9[201586]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52283 DF PROTO=TCP SPT=58718 DPT=9105 SEQ=2224049347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992C1680000000001030307) Feb 20 04:19:42 localhost python3.9[201696]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:43 localhost python3.9[201806]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:43 localhost python3.9[201916]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 20 04:19:44 localhost python3.9[202026]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:45 localhost python3.9[202136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:45 localhost python3.9[202246]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11267 DF PROTO=TCP SPT=44804 DPT=9102 SEQ=4043637556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992D1680000000001030307) Feb 20 04:19:46 localhost python3.9[202356]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:19:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) 
CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdo Feb 20 04:19:47 localhost python3.9[202466]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40631 DF PROTO=TCP SPT=46782 DPT=9882 SEQ=1592775776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992D7FA0000000001030307) Feb 20 04:19:47 localhost python3.9[202576]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:49 localhost python3.9[202686]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:49 localhost python3.9[202796]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:50 localhost python3.9[202906]: ansible-ansible.legacy.stat Invoked with 
path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40633 DF PROTO=TCP SPT=46782 DPT=9882 SEQ=1592775776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992E3E90000000001030307) Feb 20 04:19:51 localhost python3.9[202994]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579190.0395901-2327-144118274078383/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:19:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp 
Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.013 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 
KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 
level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 
0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Feb 20 04:19:51 localhost python3.9[203104]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:52 localhost python3.9[203192]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579191.184508-2327-104139547922568/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:52 localhost python3.9[203302]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:53 localhost python3.9[203390]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579192.4147668-2327-214635171512767/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 
checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:53 localhost python3.9[203500]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:54 localhost python3.9[203588]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579193.5151584-2327-124124188223815/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40634 DF PROTO=TCP SPT=46782 DPT=9882 SEQ=1592775776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992F3A90000000001030307) Feb 20 04:19:55 localhost python3.9[203698]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:55 localhost python3.9[203786]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771579194.719194-2327-269056524107679/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:56 localhost python3.9[203896]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:57 localhost python3.9[203984]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579196.0122721-2327-178922098812536/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41461 DF PROTO=TCP SPT=60506 DPT=9105 SEQ=3569357637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992FEA80000000001030307) Feb 20 04:19:57 localhost python3.9[204094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:19:58 localhost python3.9[204182]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579197.205364-2327-48964405654693/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:19:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41462 DF PROTO=TCP SPT=60506 DPT=9105 SEQ=3569357637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599306A80000000001030307) Feb 20 04:19:59 localhost python3.9[204292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:00 localhost python3.9[204380]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579199.146064-2327-196876657251657/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:01 localhost python3.9[204490]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:01 
localhost python3.9[204578]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579200.3764126-2327-233536110770031/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:02 localhost python3.9[204688]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40635 DF PROTO=TCP SPT=46782 DPT=9882 SEQ=1592775776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599313680000000001030307) Feb 20 04:20:03 localhost python3.9[204776]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579201.978815-2327-84156838051085/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:03 localhost python3.9[204886]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Feb 20 04:20:04 localhost python3.9[204974]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579203.229483-2327-71804683373422/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:04 localhost python3.9[205084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:05 localhost python3.9[205172]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579204.3953495-2327-219570583992080/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:20:05.976 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:20:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:20:05.977 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:20:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:20:05.978 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:20:06 localhost python3.9[205282]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:06 localhost sshd[205283]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:20:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1951 DF PROTO=TCP SPT=38332 DPT=9101 SEQ=3436120266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599320A40000000001030307) Feb 20 04:20:06 localhost python3.9[205372]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579205.619173-2327-280455899276421/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:07 localhost python3.9[205482]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:07 localhost python3.9[205570]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579206.859325-2327-268408291450871/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:08 localhost python3.9[205678]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:20:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:20:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:20:09 localhost podman[205737]: 2026-02-20 09:20:09.165045995 +0000 UTC m=+0.088478977 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 04:20:09 localhost podman[205738]: 2026-02-20 09:20:09.246683855 +0000 UTC m=+0.169797266 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Feb 20 04:20:09 localhost podman[205738]: 2026-02-20 09:20:09.254340456 +0000 UTC m=+0.177453827 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Feb 20 04:20:09 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:20:09 localhost podman[205737]: 2026-02-20 09:20:09.279196325 +0000 UTC m=+0.202629297 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Feb 20 04:20:09 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:20:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1953 DF PROTO=TCP SPT=38332 DPT=9101 SEQ=3436120266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59932CA80000000001030307) Feb 20 04:20:09 localhost python3.9[205835]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Feb 20 04:20:11 localhost python3.9[205945]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:20:11 localhost systemd[1]: Reloading. Feb 20 04:20:11 localhost systemd-rc-local-generator[205967]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:20:11 localhost systemd-sysv-generator[205975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:11 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:11 localhost systemd[1]: Starting libvirt logging daemon socket... Feb 20 04:20:11 localhost systemd[1]: Listening on libvirt logging daemon socket. Feb 20 04:20:11 localhost systemd[1]: Starting libvirt logging daemon admin socket... Feb 20 04:20:11 localhost systemd[1]: Listening on libvirt logging daemon admin socket. Feb 20 04:20:11 localhost systemd[1]: Starting libvirt logging daemon... Feb 20 04:20:11 localhost systemd[1]: Started libvirt logging daemon. Feb 20 04:20:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41464 DF PROTO=TCP SPT=60506 DPT=9105 SEQ=3569357637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599337680000000001030307) Feb 20 04:20:12 localhost python3.9[206096]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:20:12 localhost systemd[1]: Reloading. Feb 20 04:20:12 localhost systemd-rc-local-generator[206119]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:20:12 localhost systemd-sysv-generator[206127]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:12 localhost systemd[1]: Starting libvirt nodedev daemon socket... Feb 20 04:20:12 localhost systemd[1]: Listening on libvirt nodedev daemon socket. Feb 20 04:20:12 localhost systemd[1]: Starting libvirt nodedev daemon admin socket... Feb 20 04:20:12 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket... Feb 20 04:20:12 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket. Feb 20 04:20:12 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket. 
Feb 20 04:20:12 localhost systemd[1]: Started libvirt nodedev daemon. Feb 20 04:20:13 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Feb 20 04:20:13 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Feb 20 04:20:13 localhost setroubleshoot[206261]: Deleting alert 4c153363-0b75-4da9-9673-ecc521f0261c, it is allowed in current policy Feb 20 04:20:13 localhost python3.9[206272]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:20:13 localhost systemd[1]: Reloading. Feb 20 04:20:14 localhost systemd-rc-local-generator[206299]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:20:14 localhost systemd-sysv-generator[206303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:14 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:14 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service. Feb 20 04:20:14 localhost systemd[1]: Starting libvirt proxy daemon socket... Feb 20 04:20:14 localhost systemd[1]: Listening on libvirt proxy daemon socket. Feb 20 04:20:14 localhost systemd[1]: Starting libvirt proxy daemon admin socket... Feb 20 04:20:14 localhost systemd[1]: Starting libvirt proxy daemon read-only socket... Feb 20 04:20:14 localhost systemd[1]: Listening on libvirt proxy daemon admin socket. Feb 20 04:20:14 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket. Feb 20 04:20:14 localhost systemd[1]: Started libvirt proxy daemon. Feb 20 04:20:15 localhost python3.9[206451]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:20:15 localhost systemd[1]: Reloading. Feb 20 04:20:15 localhost systemd-rc-local-generator[206476]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:20:15 localhost systemd-sysv-generator[206479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 04:20:15 localhost setroubleshoot[206261]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l f0e143a7-e3ff-424e-aeeb-bf5f3a6516a4 Feb 20 04:20:15 localhost setroubleshoot[206261]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.

***** Plugin dac_override (91.4 confidence) suggests **********************

If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
Then turn on full auditing to get path information about the offending file and generate the error again.
Do

Turn on full auditing
# auditctl -w /etc/shadow -p w
Try to recreate AVC. Then execute
# ausearch -m avc -ts recent
If you see PATH record check ownership/permissions on file, and fix it,
otherwise report as a bugzilla.

***** Plugin catchall (9.59 confidence) suggests **************************

If you believe that virtlogd should have the dac_read_search capability by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
# semodule -X 300 -i my-virtlogd.pp
Feb 20 04:20:15 localhost setroubleshoot[206261]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l f0e143a7-e3ff-424e-aeeb-bf5f3a6516a4 Feb 20 04:20:15 localhost setroubleshoot[206261]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.

***** Plugin dac_override (91.4 confidence) suggests **********************

If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
Then turn on full auditing to get path information about the offending file and generate the error again.
Do

Turn on full auditing
# auditctl -w /etc/shadow -p w
Try to recreate AVC. Then execute
# ausearch -m avc -ts recent
If you see PATH record check ownership/permissions on file, and fix it,
otherwise report as a bugzilla.

***** Plugin catchall (9.59 confidence) suggests **************************

If you believe that virtlogd should have the dac_read_search capability by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
# semodule -X 300 -i my-virtlogd.pp
Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:15 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:15 localhost systemd[1]: Listening on libvirt locking daemon socket. Feb 20 04:20:15 localhost systemd[1]: Starting libvirt QEMU daemon socket... Feb 20 04:20:15 localhost systemd[1]: Listening on libvirt QEMU daemon socket. Feb 20 04:20:15 localhost systemd[1]: Starting libvirt QEMU daemon admin socket... Feb 20 04:20:15 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket... Feb 20 04:20:15 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket. Feb 20 04:20:15 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket. Feb 20 04:20:15 localhost systemd[1]: Started libvirt QEMU daemon. Feb 20 04:20:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13932 DF PROTO=TCP SPT=52974 DPT=9100 SEQ=729795198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599347680000000001030307) Feb 20 04:20:16 localhost python3.9[206634]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:20:16 localhost systemd[1]: Reloading. Feb 20 04:20:16 localhost systemd-rc-local-generator[206668]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:20:16 localhost systemd-sysv-generator[206672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:16 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:20:16 localhost systemd[1]: Starting libvirt secret daemon socket... Feb 20 04:20:16 localhost systemd[1]: Listening on libvirt secret daemon socket. Feb 20 04:20:16 localhost systemd[1]: Starting libvirt secret daemon admin socket... Feb 20 04:20:16 localhost systemd[1]: Starting libvirt secret daemon read-only socket... 
Feb 20 04:20:16 localhost systemd[1]: Listening on libvirt secret daemon admin socket. Feb 20 04:20:16 localhost systemd[1]: Listening on libvirt secret daemon read-only socket. Feb 20 04:20:16 localhost systemd[1]: Started libvirt secret daemon. Feb 20 04:20:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14391 DF PROTO=TCP SPT=35116 DPT=9882 SEQ=342328268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59934D2A0000000001030307) Feb 20 04:20:17 localhost python3.9[206817]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:18 localhost python3.9[206927]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 20 04:20:19 localhost python3.9[207037]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:20:20 localhost python3.9[207149]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] 
read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 20 04:20:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14393 DF PROTO=TCP SPT=35116 DPT=9882 SEQ=342328268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599359460000000001030307) Feb 20 04:20:21 localhost python3.9[207257]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:22 localhost python3.9[207343]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579220.8484962-3191-14855100117293/.source.xml follow=False _original_basename=secret.xml.j2 checksum=e299a5f369c62c832b857708260504de70ea24e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:22 localhost python3.9[207453]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine a8557ee9-b55d-5519-942c-cf8f6172f1d8#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:20:24 localhost python3.9[207573]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14394 DF PROTO=TCP SPT=35116 DPT=9882 SEQ=342328268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599368E80000000001030307) Feb 20 04:20:25 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully. Feb 20 04:20:25 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Feb 20 04:20:26 localhost python3.9[207910]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:27 localhost python3.9[208020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63478 DF PROTO=TCP SPT=56584 DPT=9105 SEQ=1510305160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599373E80000000001030307) Feb 20 04:20:27 localhost python3.9[208108]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771579226.7791867-3357-272241796040600/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:28 localhost python3.9[208218]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:29 localhost python3.9[208328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63479 DF PROTO=TCP SPT=56584 DPT=9105 SEQ=1510305160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59937BE90000000001030307) Feb 20 04:20:30 localhost python3.9[208385]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Feb 20 04:20:30 localhost python3.9[208495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:31 localhost python3.9[208552]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rs0pdz88 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:31 localhost python3.9[208662]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13933 DF PROTO=TCP SPT=52974 DPT=9100 SEQ=729795198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599387680000000001030307) Feb 20 04:20:33 localhost python3.9[208719]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:33 localhost python3.9[208829]: 
ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:20:35 localhost python3[208940]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 20 04:20:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20846 DF PROTO=TCP SPT=46028 DPT=9101 SEQ=1052269825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599395D70000000001030307) Feb 20 04:20:37 localhost python3.9[209050]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:20:38 localhost python3.9[209107]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:20:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20848 DF PROTO=TCP SPT=46028 DPT=9101 SEQ=1052269825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993A1E80000000001030307) Feb 20 04:20:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:20:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:20:39 localhost systemd[1]: tmp-crun.eMPfXX.mount: Deactivated successfully.
Feb 20 04:20:39 localhost podman[209219]: 2026-02-20 09:20:39.67276003 +0000 UTC m=+0.098991760 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:20:39 localhost podman[209219]: 2026-02-20 09:20:39.679826189 +0000 UTC m=+0.106057919 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 04:20:39 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:20:39 localhost python3.9[209217]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:20:39 localhost podman[209218]: 2026-02-20 09:20:39.76616624 +0000 UTC m=+0.192421861 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 04:20:39 localhost podman[209218]: 2026-02-20 09:20:39.798955369 +0000 UTC m=+0.225210990 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:20:39 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:20:40 localhost python3.9[209349]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579238.9304204-3624-174519557802430/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:41 localhost python3.9[209459]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:20:41 localhost python3.9[209552]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63481 DF PROTO=TCP SPT=56584 DPT=9105 SEQ=1510305160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993AB690000000001030307)
Feb 20 04:20:42 localhost python3.9[209693]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:20:42 localhost python3.9[209766]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:44 localhost python3.9[209878]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:20:44 localhost python3.9[209968]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579243.0479882-3741-33841473204825/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:45 localhost python3.9[210078]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:45 localhost sshd[210079]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:20:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30004 DF PROTO=TCP SPT=38372 DPT=9102 SEQ=2768683098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993BB680000000001030307)
Feb 20 04:20:47 localhost python3.9[210190]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:20:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53682 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=1854450508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993C25A0000000001030307)
Feb 20 04:20:47 localhost sshd[210304]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:20:47 localhost python3.9[210303]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:48 localhost sshd[210323]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:20:48 localhost python3.9[210417]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:20:49 localhost python3.9[210528]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:20:50 localhost python3.9[210640]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:20:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53684 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=1854450508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993CE680000000001030307)
Feb 20 04:20:51 localhost python3.9[210753]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:51 localhost python3.9[210863]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:20:52 localhost python3.9[210951]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579251.2760997-3957-162186007319157/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:52 localhost python3.9[211061]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:20:53 localhost python3.9[211149]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579252.4943192-4002-245225278398161/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:54 localhost python3.9[211259]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:20:54 localhost python3.9[211347]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579253.7050261-4048-69547586338565/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:20:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53685 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=1854450508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993DE280000000001030307)
Feb 20 04:20:55 localhost python3.9[211457]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:20:55 localhost systemd[1]: Reloading.
Feb 20 04:20:55 localhost systemd-rc-local-generator[211481]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:20:55 localhost systemd-sysv-generator[211486]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:55 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:56 localhost systemd[1]: Reached target edpm_libvirt.target.
Feb 20 04:20:57 localhost python3.9[211607]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 20 04:20:57 localhost systemd[1]: Reloading.
Feb 20 04:20:57 localhost systemd-rc-local-generator[211636]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:20:57 localhost systemd-sysv-generator[211639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47117 DF PROTO=TCP SPT=58588 DPT=9105 SEQ=2990462913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993E9280000000001030307)
Feb 20 04:20:57 localhost systemd[1]: Reloading.
Feb 20 04:20:57 localhost systemd-rc-local-generator[211672]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:20:57 localhost systemd-sysv-generator[211676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:20:58 localhost systemd[1]: session-53.scope: Deactivated successfully.
Feb 20 04:20:58 localhost systemd[1]: session-53.scope: Consumed 3min 23.876s CPU time.
Feb 20 04:20:58 localhost systemd-logind[759]: Session 53 logged out. Waiting for processes to exit.
Feb 20 04:20:58 localhost systemd-logind[759]: Removed session 53.
Feb 20 04:20:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47118 DF PROTO=TCP SPT=58588 DPT=9105 SEQ=2990462913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993F1290000000001030307)
Feb 20 04:21:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53686 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=1854450508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993FF680000000001030307)
Feb 20 04:21:03 localhost sshd[211700]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:21:03 localhost systemd-logind[759]: New session 54 of user zuul.
Feb 20 04:21:03 localhost systemd[1]: Started Session 54 of User zuul.
Feb 20 04:21:04 localhost python3.9[211811]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:21:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:21:05.977 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 04:21:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:21:05.978 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 04:21:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:21:05.980 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 04:21:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64752 DF PROTO=TCP SPT=50190 DPT=9101 SEQ=781888736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59940B050000000001030307)
Feb 20 04:21:06 localhost python3.9[211923]: ansible-ansible.builtin.service_facts Invoked
Feb 20 04:21:06 localhost network[211940]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 04:21:06 localhost network[211941]: 'network-scripts' will be removed from distribution in near future.
Feb 20 04:21:06 localhost network[211942]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 04:21:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:21:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64754 DF PROTO=TCP SPT=50190 DPT=9101 SEQ=781888736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599417280000000001030307)
Feb 20 04:21:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:21:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:21:09 localhost podman[212058]: 2026-02-20 09:21:09.855131074 +0000 UTC m=+0.106551355 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 20 04:21:09 localhost podman[212058]: 2026-02-20 09:21:09.891089896 +0000 UTC m=+0.142510177 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 20 04:21:09 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:21:09 localhost podman[212078]: 2026-02-20 09:21:09.95707135 +0000 UTC m=+0.091920143 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 04:21:10 localhost podman[212078]: 2026-02-20 09:21:10.020669315 +0000 UTC m=+0.155518118 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 04:21:10 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:21:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47120 DF PROTO=TCP SPT=58588 DPT=9105 SEQ=2990462913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599421680000000001030307)
Feb 20 04:21:13 localhost python3.9[212215]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:21:14 localhost python3.9[212278]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:21:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28353 DF PROTO=TCP SPT=40082 DPT=9102 SEQ=2778869450 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599431680000000001030307)
Feb 20 04:21:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52001 DF PROTO=TCP SPT=54810 DPT=9882 SEQ=1342684072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994378A0000000001030307)
Feb 20 04:21:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52003 DF PROTO=TCP SPT=54810 DPT=9882 SEQ=1342684072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599443A80000000001030307)
Feb 20 04:21:22 localhost python3.9[212390]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:21:23 localhost python3.9[212502]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:21:24 localhost python3.9[212612]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:21:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52004 DF PROTO=TCP SPT=54810 DPT=9882 SEQ=1342684072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599453680000000001030307)
Feb 20 04:21:24 localhost python3.9[212723]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:21:25 localhost python3.9[212834]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:21:26 localhost python3.9[212945]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:21:27 localhost python3.9[213057]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:21:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36335 DF PROTO=TCP SPT=50972 DPT=9105 SEQ=4082688825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59945E280000000001030307)
Feb 20 04:21:27 localhost sshd[213119]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:21:28 localhost python3.9[213169]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:21:29 localhost systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 20 04:21:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36336 DF PROTO=TCP SPT=50972 DPT=9105 SEQ=4082688825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599466280000000001030307) Feb 20 04:21:31 localhost python3.9[213283]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:21:31 localhost systemd[1]: Reloading. Feb 20 04:21:31 localhost sshd[213287]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:21:31 localhost systemd-rc-local-generator[213315]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:21:31 localhost systemd-sysv-generator[213318]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:31 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:31 localhost systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi). Feb 20 04:21:31 localhost systemd[1]: Starting Open-iSCSI... Feb 20 04:21:31 localhost iscsid[213326]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Feb 20 04:21:31 localhost iscsid[213326]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.[:identifier]. Feb 20 04:21:31 localhost iscsid[213326]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Feb 20 04:21:31 localhost iscsid[213326]: If using hardware iscsi like qla4xxx this message can be ignored. 
Feb 20 04:21:31 localhost iscsid[213326]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Feb 20 04:21:31 localhost iscsid[213326]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Feb 20 04:21:31 localhost iscsid[213326]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf Feb 20 04:21:31 localhost systemd[1]: Started Open-iSCSI. Feb 20 04:21:31 localhost systemd[1]: Starting Logout off all iSCSI sessions on shutdown... Feb 20 04:21:31 localhost systemd[1]: Finished Logout off all iSCSI sessions on shutdown. Feb 20 04:21:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52005 DF PROTO=TCP SPT=54810 DPT=9882 SEQ=1342684072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599473690000000001030307) Feb 20 04:21:33 localhost python3.9[213437]: ansible-ansible.builtin.service_facts Invoked Feb 20 04:21:33 localhost network[213454]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 04:21:33 localhost network[213455]: 'network-scripts' will be removed from distribution in near future. Feb 20 04:21:33 localhost network[213456]: It is advised to switch to 'NetworkManager' instead for network management. Feb 20 04:21:33 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Feb 20 04:21:34 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Feb 20 04:21:34 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service. Feb 20 04:21:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. 
confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. 
confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480 Feb 20 04:21:35 localhost setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Feb 20 04:21:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22408 DF PROTO=TCP SPT=57456 DPT=9101 SEQ=2747713421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599480340000000001030307) Feb 20 04:21:38 localhost python3.9[213705]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True 
sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:21:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22410 DF PROTO=TCP SPT=57456 DPT=9101 SEQ=2747713421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59948C290000000001030307) Feb 20 04:21:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:21:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:21:40 localhost podman[213709]: 2026-02-20 09:21:40.157680992 +0000 UTC m=+0.090504504 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 04:21:40 localhost podman[213709]: 2026-02-20 09:21:40.18784259 +0000 UTC m=+0.120666112 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:21:40 localhost systemd[1]: tmp-crun.QaNOQz.mount: Deactivated successfully. Feb 20 04:21:40 localhost podman[213708]: 2026-02-20 09:21:40.202854846 +0000 UTC m=+0.135250995 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller) Feb 20 04:21:40 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:21:40 localhost podman[213708]: 2026-02-20 09:21:40.306448967 +0000 UTC m=+0.238845106 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:21:40 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:21:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36338 DF PROTO=TCP SPT=50972 DPT=9105 SEQ=4082688825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599495680000000001030307) Feb 20 04:21:42 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 04:21:42 localhost systemd[1]: Starting man-db-cache-update.service... Feb 20 04:21:42 localhost systemd[1]: Reloading. Feb 20 04:21:42 localhost systemd-sysv-generator[213790]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:21:42 localhost systemd-rc-local-generator[213786]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:21:42 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 20 04:21:42 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 04:21:42 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 20 04:21:42 localhost systemd[1]: Finished man-db-cache-update.service. Feb 20 04:21:42 localhost systemd[1]: run-rc5ef00701b6d49138398471c696372cc.service: Deactivated successfully. Feb 20 04:21:42 localhost systemd[1]: run-re412e902847f4578b27d977c8b3bcfb2.service: Deactivated successfully. Feb 20 04:21:44 localhost python3.9[214105]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 20 04:21:45 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully. Feb 20 04:21:45 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. 
Feb 20 04:21:45 localhost python3.9[214215]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Feb 20 04:21:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17525 DF PROTO=TCP SPT=52864 DPT=9102 SEQ=2001544759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994A5680000000001030307) Feb 20 04:21:46 localhost python3.9[214329]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:21:46 localhost python3.9[214435]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579305.7922926-486-185059126350890/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52973 DF PROTO=TCP SPT=40052 DPT=9882 SEQ=2359454866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994ACBA0000000001030307) Feb 20 04:21:47 localhost python3.9[214545]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:49 localhost python3.9[214655]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:21:49 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 20 04:21:49 localhost systemd[1]: Stopped Load Kernel Modules. Feb 20 04:21:49 localhost systemd[1]: Stopping Load Kernel Modules... Feb 20 04:21:49 localhost systemd[1]: Starting Load Kernel Modules... Feb 20 04:21:49 localhost systemd-modules-load[214659]: Module 'msr' is built in Feb 20 04:21:49 localhost systemd[1]: Finished Load Kernel Modules. Feb 20 04:21:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52975 DF PROTO=TCP SPT=40052 DPT=9882 SEQ=2359454866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994B8A90000000001030307) Feb 20 04:21:51 localhost python3.9[214770]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:21:52 localhost python3.9[214881]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:21:52 localhost python3.9[214991]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:21:53 localhost python3.9[215079]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579312.475965-638-18158503919149/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:54 localhost python3.9[215189]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:21:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52976 DF PROTO=TCP SPT=40052 DPT=9882 SEQ=2359454866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994C8680000000001030307) Feb 20 04:21:54 localhost python3.9[215300]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:55 localhost python3.9[215410]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:55 localhost systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a 
fill level at 75.1 (250 of 333 items), suggesting rotation. Feb 20 04:21:55 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 20 04:21:55 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:21:55 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:21:56 localhost python3.9[215521]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:57 localhost python3.9[215631]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19709 DF PROTO=TCP SPT=51054 DPT=9105 SEQ=3258922464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994D3690000000001030307) Feb 20 04:21:58 localhost python3.9[215741]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None 
validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:59 localhost python3.9[215851]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:59 localhost python3.9[215961]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:21:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19710 DF PROTO=TCP SPT=51054 DPT=9105 SEQ=3258922464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994DB680000000001030307) Feb 20 04:22:00 localhost python3.9[216071]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:22:01 localhost python3.9[216183]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:22:02 localhost python3.9[216294]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket 
state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:22:02 localhost systemd[1]: Listening on multipathd control socket. Feb 20 04:22:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52604 DF PROTO=TCP SPT=60274 DPT=9100 SEQ=446721079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994E7680000000001030307) Feb 20 04:22:04 localhost python3.9[216408]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:22:05 localhost systemd[1]: Starting Wait for udev To Complete Device Initialization... Feb 20 04:22:05 localhost udevadm[216413]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in. Feb 20 04:22:05 localhost systemd[1]: Finished Wait for udev To Complete Device Initialization. Feb 20 04:22:05 localhost systemd[1]: Starting Device-Mapper Multipath Device Controller... Feb 20 04:22:05 localhost multipathd[216416]: --------start up-------- Feb 20 04:22:05 localhost multipathd[216416]: read /etc/multipath.conf Feb 20 04:22:05 localhost multipathd[216416]: path checkers start up Feb 20 04:22:05 localhost systemd[1]: Started Device-Mapper Multipath Device Controller. 
Feb 20 04:22:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:22:05.978 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:22:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:22:05.981 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:22:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:22:05.982 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:22:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20791 DF PROTO=TCP SPT=37032 DPT=9101 SEQ=76710761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994F5650000000001030307) Feb 20 04:22:06 localhost python3.9[216534]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 20 04:22:07 localhost python3.9[216644]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Feb 20 04:22:07 localhost sshd[216645]: main: sshd: ssh-rsa 
algorithm is disabled Feb 20 04:22:08 localhost python3.9[216764]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:22:08 localhost sshd[216781]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:22:08 localhost python3.9[216854]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579327.896345-1028-43114478183607/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:22:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20793 DF PROTO=TCP SPT=37032 DPT=9101 SEQ=76710761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599501680000000001030307) Feb 20 04:22:10 localhost python3.9[216964]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:22:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:22:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:22:10 localhost sshd[217092]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:22:10 localhost podman[217076]: 2026-02-20 09:22:10.840774925 +0000 UTC m=+0.085102096 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, config_id=ovn_metadata_agent) Feb 20 04:22:10 localhost systemd[1]: tmp-crun.FZp9jc.mount: Deactivated successfully. Feb 20 04:22:10 localhost podman[217075]: 2026-02-20 09:22:10.914111015 +0000 UTC m=+0.158971993 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:22:10 localhost podman[217076]: 2026-02-20 09:22:10.923247859 +0000 UTC m=+0.167575020 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, 
name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 04:22:10 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:22:10 localhost podman[217075]: 2026-02-20 09:22:10.950510927 +0000 UTC m=+0.195371895 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible) Feb 20 04:22:10 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:22:11 localhost python3.9[217074]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:22:11 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 20 04:22:11 localhost systemd[1]: Stopped Load Kernel Modules. Feb 20 04:22:11 localhost systemd[1]: Stopping Load Kernel Modules... Feb 20 04:22:11 localhost systemd[1]: Starting Load Kernel Modules... Feb 20 04:22:11 localhost systemd-modules-load[217122]: Module 'msr' is built in Feb 20 04:22:11 localhost systemd[1]: Finished Load Kernel Modules. Feb 20 04:22:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19712 DF PROTO=TCP SPT=51054 DPT=9105 SEQ=3258922464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59950B680000000001030307) Feb 20 04:22:12 localhost python3.9[217232]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:22:12 localhost systemd[1]: virtnodedevd.service: Deactivated successfully. Feb 20 04:22:14 localhost systemd[1]: virtproxyd.service: Deactivated successfully. 
Feb 20 04:22:15 localhost sshd[217237]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:22:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36937 DF PROTO=TCP SPT=48260 DPT=9102 SEQ=1628961010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59951B680000000001030307) Feb 20 04:22:16 localhost systemd[1]: Reloading. Feb 20 04:22:16 localhost systemd-sysv-generator[217275]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:22:16 localhost systemd-rc-local-generator[217272]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: Reloading. Feb 20 04:22:16 localhost systemd-rc-local-generator[217309]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:22:16 localhost systemd-sysv-generator[217312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:16 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button) Feb 20 04:22:17 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Feb 20 04:22:17 localhost lvm[217359]: PV /dev/loop4 online, VG ceph_vg1 is complete. Feb 20 04:22:17 localhost lvm[217359]: VG ceph_vg1 finished Feb 20 04:22:17 localhost lvm[217358]: PV /dev/loop3 online, VG ceph_vg0 is complete. Feb 20 04:22:17 localhost lvm[217358]: VG ceph_vg0 finished Feb 20 04:22:17 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 20 04:22:17 localhost systemd[1]: Starting man-db-cache-update.service... Feb 20 04:22:17 localhost systemd[1]: Reloading. Feb 20 04:22:17 localhost systemd-sysv-generator[217410]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:22:17 localhost systemd-rc-local-generator[217407]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:17 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 20 04:22:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34958 DF PROTO=TCP SPT=39408 DPT=9882 SEQ=1749105025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599521EB0000000001030307) Feb 20 04:22:18 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 20 04:22:18 localhost systemd[1]: Finished man-db-cache-update.service. 
Feb 20 04:22:18 localhost systemd[1]: man-db-cache-update.service: Consumed 1.237s CPU time. Feb 20 04:22:18 localhost systemd[1]: run-rf52af6cdffbd4b16a79eace143fb076e.service: Deactivated successfully. Feb 20 04:22:19 localhost python3.9[218666]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:22:19 localhost systemd[1]: Stopping Device-Mapper Multipath Device Controller... Feb 20 04:22:19 localhost multipathd[216416]: exit (signal) Feb 20 04:22:19 localhost multipathd[216416]: --------shut down------- Feb 20 04:22:19 localhost systemd[1]: multipathd.service: Deactivated successfully. Feb 20 04:22:19 localhost systemd[1]: Stopped Device-Mapper Multipath Device Controller. Feb 20 04:22:19 localhost systemd[1]: Starting Device-Mapper Multipath Device Controller... Feb 20 04:22:19 localhost multipathd[218672]: --------start up-------- Feb 20 04:22:19 localhost multipathd[218672]: read /etc/multipath.conf Feb 20 04:22:19 localhost multipathd[218672]: path checkers start up Feb 20 04:22:19 localhost systemd[1]: Started Device-Mapper Multipath Device Controller. 
Feb 20 04:22:20 localhost python3.9[218788]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:22:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34960 DF PROTO=TCP SPT=39408 DPT=9882 SEQ=1749105025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59952DE90000000001030307) Feb 20 04:22:21 localhost python3.9[218902]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:22:23 localhost python3.9[219012]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:22:23 localhost systemd[1]: Reloading. Feb 20 04:22:23 localhost systemd-sysv-generator[219043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:22:23 localhost systemd-rc-local-generator[219039]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:23 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:22:24 localhost python3.9[219156]: ansible-ansible.builtin.service_facts Invoked Feb 20 04:22:24 localhost network[219173]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 04:22:24 localhost network[219174]: 'network-scripts' will be removed from distribution in near future. Feb 20 04:22:24 localhost network[219175]: It is advised to switch to 'NetworkManager' instead for network management. 
Feb 20 04:22:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34961 DF PROTO=TCP SPT=39408 DPT=9882 SEQ=1749105025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59953DA80000000001030307)
Feb 20 04:22:26 localhost systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 20 04:22:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:22:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42734 DF PROTO=TCP SPT=58974 DPT=9105 SEQ=2211298961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599548A80000000001030307)
Feb 20 04:22:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42735 DF PROTO=TCP SPT=58974 DPT=9105 SEQ=2211298961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599550A80000000001030307)
Feb 20 04:22:30 localhost python3.9[219409]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:22:31 localhost python3.9[219520]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:22:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34962 DF PROTO=TCP SPT=39408 DPT=9882 SEQ=1749105025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59955D690000000001030307)
Feb 20 04:22:33 localhost python3.9[219631]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:22:34 localhost python3.9[219742]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:22:35 localhost python3.9[219853]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:22:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5836 DF PROTO=TCP SPT=40392 DPT=9101 SEQ=4145350747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59956A950000000001030307)
Feb 20 04:22:37 localhost python3.9[219964]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:22:37 localhost python3.9[220075]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:22:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5838 DF PROTO=TCP SPT=40392 DPT=9101 SEQ=4145350747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599576A80000000001030307)
Feb 20 04:22:39 localhost python3.9[220186]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:22:40 localhost python3.9[220297]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:22:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:22:41 localhost podman[220381]: 2026-02-20 09:22:41.170912639 +0000 UTC m=+0.102825551 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 04:22:41 localhost podman[220381]: 2026-02-20 09:22:41.201547264 +0000 UTC m=+0.133460216 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 20 04:22:41 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:22:41 localhost podman[220376]: 2026-02-20 09:22:41.258857616 +0000 UTC m=+0.190848641 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 04:22:41 localhost podman[220376]: 2026-02-20 09:22:41.302113642 +0000 UTC m=+0.234104667 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.license=GPLv2)
Feb 20 04:22:41 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:22:41 localhost python3.9[220435]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:42 localhost python3.9[220557]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42737 DF PROTO=TCP SPT=58974 DPT=9105 SEQ=2211298961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599581680000000001030307)
Feb 20 04:22:42 localhost python3.9[220667]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:43 localhost python3.9[220777]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:43 localhost python3.9[220887]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:45 localhost python3.9[220997]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:45 localhost python3.9[221107]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39922 DF PROTO=TCP SPT=44428 DPT=9100 SEQ=3565515427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599591680000000001030307)
Feb 20 04:22:47 localhost python3.9[221253]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:47 localhost python3.9[221419]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55861 DF PROTO=TCP SPT=46612 DPT=9882 SEQ=270082323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995971A0000000001030307)
Feb 20 04:22:48 localhost python3.9[221561]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:48 localhost python3.9[221689]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:49 localhost python3.9[221799]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:50 localhost python3.9[221909]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:50 localhost python3.9[222019]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55863 DF PROTO=TCP SPT=46612 DPT=9882 SEQ=270082323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995A3280000000001030307)
Feb 20 04:22:51 localhost python3.9[222129]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:22:52 localhost python3.9[222239]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:22:52 localhost python3.9[222349]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 04:22:53 localhost python3.9[222459]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 04:22:53 localhost systemd[1]: Reloading.
Feb 20 04:22:53 localhost systemd-rc-local-generator[222484]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:22:53 localhost systemd-sysv-generator[222489]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:22:53 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:22:53 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:22:53 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:22:53 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:22:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55864 DF PROTO=TCP SPT=46612 DPT=9882 SEQ=270082323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995B2E80000000001030307)
Feb 20 04:22:54 localhost python3.9[222605]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:22:55 localhost python3.9[222716]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:22:56 localhost python3.9[222827]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:22:57 localhost python3.9[222938]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:22:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16936 DF PROTO=TCP SPT=35494 DPT=9105 SEQ=768474994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995BDE80000000001030307)
Feb 20 04:22:58 localhost python3.9[223049]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:22:59 localhost python3.9[223160]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:22:59 localhost sshd[223162]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:22:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16937 DF PROTO=TCP SPT=35494 DPT=9105 SEQ=768474994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995C5E90000000001030307)
Feb 20 04:23:00 localhost python3.9[223273]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:23:01 localhost python3.9[223384]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 04:23:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39923 DF PROTO=TCP SPT=44428 DPT=9100 SEQ=3565515427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995D1680000000001030307)
Feb 20 04:23:03 localhost python3.9[223495]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:03 localhost python3.9[223605]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:04 localhost python3.9[223715]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:04 localhost python3.9[223825]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:23:05.979 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 04:23:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:23:05.980 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 04:23:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:23:05.981 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 04:23:06 localhost python3.9[223935]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41542 DF PROTO=TCP SPT=57246 DPT=9101 SEQ=3677019780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995DFC50000000001030307)
Feb 20 04:23:06 localhost python3.9[224045]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:07 localhost python3.9[224155]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:08 localhost python3.9[224265]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:09 localhost python3.9[224375]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:23:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41544 DF PROTO=TCP SPT=57246 DPT=9101 SEQ=3677019780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995EBE80000000001030307)
Feb 20 04:23:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16939 DF PROTO=TCP SPT=35494 DPT=9105 SEQ=768474994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995F5680000000001030307)
Feb 20 04:23:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:23:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:23:12 localhost podman[224393]: 2026-02-20 09:23:12.142748172 +0000 UTC m=+0.079674395 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 04:23:12 localhost podman[224393]: 2026-02-20 09:23:12.186321767 +0000 UTC m=+0.123247930 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:23:12 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:23:12 localhost podman[224394]: 2026-02-20 09:23:12.200164178 +0000 UTC m=+0.136694839 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:23:12 localhost 
podman[224394]: 2026-02-20 09:23:12.236000248 +0000 UTC m=+0.172530929 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent) Feb 20 04:23:12 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:23:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29675 DF PROTO=TCP SPT=47610 DPT=9100 SEQ=1067107032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599605680000000001030307) Feb 20 04:23:16 localhost python3.9[224528]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Feb 20 04:23:17 localhost python3.9[224639]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Feb 20 04:23:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31335 DF PROTO=TCP SPT=49632 DPT=9882 SEQ=3671846449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59960C4B0000000001030307) Feb 20 04:23:18 localhost python3.9[224755]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Feb 20 04:23:20 localhost sshd[224781]: main: sshd: ssh-rsa 
algorithm is disabled Feb 20 04:23:20 localhost systemd-logind[759]: New session 55 of user zuul. Feb 20 04:23:20 localhost systemd[1]: Started Session 55 of User zuul. Feb 20 04:23:20 localhost systemd[1]: session-55.scope: Deactivated successfully. Feb 20 04:23:20 localhost systemd-logind[759]: Session 55 logged out. Waiting for processes to exit. Feb 20 04:23:20 localhost systemd-logind[759]: Removed session 55. Feb 20 04:23:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31337 DF PROTO=TCP SPT=49632 DPT=9882 SEQ=3671846449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599618680000000001030307) Feb 20 04:23:21 localhost python3.9[224892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:23:22 localhost python3.9[224947]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:23:22 localhost python3.9[225055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:23:23 localhost python3.9[225141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771579402.3160772-2631-240597552342396/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:23:23 localhost python3.9[225249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:23:24 localhost python3.9[225335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579403.4708257-2631-187110727356389/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:23:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31338 DF PROTO=TCP SPT=49632 DPT=9882 SEQ=3671846449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599628280000000001030307) Feb 20 04:23:25 localhost python3.9[225443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:23:25 localhost python3.9[225529]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 
setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579404.6495278-2631-222227080497075/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:23:26 localhost python3.9[225637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:23:26 localhost python3.9[225723]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579405.7649143-2793-177327728494929/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=50598ea057afd85a1f5b995974d61e2c257c9737 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:23:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64576 DF PROTO=TCP SPT=53066 DPT=9105 SEQ=3853631590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599632E80000000001030307) Feb 20 04:23:27 localhost python3.9[225833]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:28 localhost python3.9[225943]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:29 localhost python3.9[226053]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:23:29 localhost auditd[725]: Audit daemon rotating log files Feb 20 04:23:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64577 DF PROTO=TCP SPT=53066 DPT=9105 SEQ=3853631590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59963AE80000000001030307) Feb 20 04:23:29 localhost python3.9[226165]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:30 localhost python3.9[226273]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:23:32 localhost python3.9[226385]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul 
path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:32 localhost python3.9[226495]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:23:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31339 DF PROTO=TCP SPT=49632 DPT=9882 SEQ=3671846449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599649680000000001030307) Feb 20 04:23:33 localhost python3.9[226603]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:36 localhost python3.9[226907]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False Feb 20 04:23:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=34701 DF PROTO=TCP SPT=53984 DPT=9101 SEQ=1340782939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599654F50000000001030307) Feb 20 04:23:37 localhost python3.9[227017]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:23:38 localhost python3[227127]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:23:38 localhost podman[227163]: Feb 20 04:23:38 localhost podman[227163]: 2026-02-20 09:23:38.706682021 +0000 UTC m=+0.087103302 container create 8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}) Feb 20 04:23:38 localhost podman[227163]: 2026-02-20 09:23:38.663201272 +0000 UTC m=+0.043622563 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 20 04:23:38 localhost python3[227127]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 
/sbin/nova_statedir_ownership.py | logger -t nova_compute_init Feb 20 04:23:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34703 DF PROTO=TCP SPT=53984 DPT=9101 SEQ=1340782939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599660E80000000001030307) Feb 20 04:23:39 localhost python3.9[227311]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:23:40 localhost python3.9[227421]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 20 04:23:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64579 DF PROTO=TCP SPT=53066 DPT=9105 SEQ=3853631590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59966B680000000001030307) Feb 20 04:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:23:43 localhost systemd[1]: tmp-crun.LMNY1C.mount: Deactivated successfully. 
Feb 20 04:23:43 localhost podman[227532]: 2026-02-20 09:23:43.076895644 +0000 UTC m=+0.088759196 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 04:23:43 localhost podman[227532]: 2026-02-20 09:23:43.120057741 +0000 UTC m=+0.131921313 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 20 04:23:43 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:23:43 localhost podman[227533]: 2026-02-20 09:23:43.139767761 +0000 UTC m=+0.149420692 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:23:43 localhost 
podman[227533]: 2026-02-20 09:23:43.14660828 +0000 UTC m=+0.156261241 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Feb 20 04:23:43 localhost python3.9[227531]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:23:43 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:23:43 localhost python3.9[227665]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579421.378471-3263-108440294367717/.source.yaml _original_basename=.nm_8zu1z follow=False checksum=201984e070e9869531933fce67c78d3ce61bb83b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:44 localhost systemd[1]: tmp-crun.Mtj8Xm.mount: Deactivated successfully. Feb 20 04:23:44 localhost sshd[227666]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:23:45 localhost python3.9[227777]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37443 DF PROTO=TCP SPT=49524 DPT=9102 SEQ=300155209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59967B680000000001030307) Feb 20 04:23:46 localhost python3.9[227887]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:23:47 localhost python3.9[227997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:23:47 localhost python3.9[228087]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579426.597802-3363-22528452210213/.source.json _original_basename=.86e5bwca follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9076 DF PROTO=TCP SPT=56050 DPT=9882 SEQ=3012736530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996817B0000000001030307) Feb 20 04:23:48 localhost python3.9[228195]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:50 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9078 DF PROTO=TCP SPT=56050 DPT=9882 SEQ=3012736530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59968D680000000001030307) Feb 20 04:23:51 localhost python3.9[228585]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False Feb 20 04:23:52 localhost python3.9[228695]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:23:53 localhost python3[228805]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:23:53 localhost python3[228805]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",#012 "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-01-30T06:31:38.534497001Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 
"org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1214548351,#012 "VirtualSize": 1214548351,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",#012 "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",#012 "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 
"org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": 
"/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Feb 20 04:23:53 localhost podman[228856]: 2026-02-20 09:23:53.971784448 +0000 UTC m=+0.092643819 container remove a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step5, vcs-type=git, build-date=2026-01-12T23:32:04Z, tcib_managed=true, 
com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Feb 20 04:23:53 localhost python3[228805]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Feb 20 04:23:54 localhost podman[228870]: Feb 20 04:23:54 localhost podman[228870]: 2026-02-20 09:23:54.079750746 +0000 UTC m=+0.088285951 container create 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, container_name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 20 04:23:54 localhost podman[228870]: 2026-02-20 09:23:54.035757611 +0000 UTC m=+0.044292816 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 20 04:23:54 localhost python3[228805]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Feb 20 04:23:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9079 DF PROTO=TCP SPT=56050 DPT=9882 SEQ=3012736530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59969D280000000001030307) Feb 20 04:23:55 localhost python3.9[229018]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False 
get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:23:56 localhost python3.9[229130]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:56 localhost python3.9[229185]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:23:57 localhost python3.9[229294]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579436.6840694-3596-258715125999464/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:23:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20917 DF PROTO=TCP SPT=36818 DPT=9105 SEQ=3333752055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996A8290000000001030307) Feb 20 04:23:57 localhost python3.9[229349]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:23:57 localhost systemd[1]: Reloading. 
Feb 20 04:23:57 localhost systemd-sysv-generator[229374]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:23:57 localhost systemd-rc-local-generator[229371]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:58 localhost python3.9[229440]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:23:58 localhost systemd[1]: Reloading. Feb 20 04:23:58 localhost systemd-rc-local-generator[229465]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:23:58 localhost systemd-sysv-generator[229469]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:23:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Feb 20 04:23:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:23:59 localhost systemd[1]: Starting nova_compute container... Feb 20 04:23:59 localhost systemd[1]: Started libcrun container. Feb 20 04:23:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 20 04:23:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 20 04:23:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 20 04:23:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 04:23:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 04:23:59 localhost podman[229481]: 2026-02-20 09:23:59.244000651 +0000 UTC 
m=+0.107736211 container init 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:23:59 localhost systemd[1]: tmp-crun.3iwntk.mount: Deactivated successfully. 
Feb 20 04:23:59 localhost podman[229481]: 2026-02-20 09:23:59.259574928 +0000 UTC m=+0.123310508 container start 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=nova_compute, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Feb 20 04:23:59 localhost podman[229481]: nova_compute Feb 20 
04:23:59 localhost nova_compute[229496]: + sudo -E kolla_set_configs
Feb 20 04:23:59 localhost systemd[1]: Started nova_compute container.
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Validating config file
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying service configuration files
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Deleting /etc/ceph
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Creating directory /etc/ceph
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Writing out command to execute
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 04:23:59 localhost nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 04:23:59 localhost nova_compute[229496]: ++ cat /run_command
Feb 20 04:23:59 localhost nova_compute[229496]: + CMD=nova-compute
Feb 20 04:23:59 localhost nova_compute[229496]: + ARGS=
Feb 20 04:23:59 localhost nova_compute[229496]: + sudo kolla_copy_cacerts
Feb 20 04:23:59 localhost nova_compute[229496]: + [[ ! -n '' ]]
Feb 20 04:23:59 localhost nova_compute[229496]: + . kolla_extend_start
Feb 20 04:23:59 localhost nova_compute[229496]: + echo 'Running command: '\''nova-compute'\'''
Feb 20 04:23:59 localhost nova_compute[229496]: Running command: 'nova-compute'
Feb 20 04:23:59 localhost nova_compute[229496]: + umask 0022
Feb 20 04:23:59 localhost nova_compute[229496]: + exec nova-compute
Feb 20 04:23:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20918 DF PROTO=TCP SPT=36818 DPT=9105 SEQ=3333752055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996B0280000000001030307)
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.048 229500 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.048 229500 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.049 229500 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.049 229500 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.164 229500 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.185 229500 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.185 229500 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Feb 20 04:24:01 localhost python3.9[229618]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.638 229500 INFO nova.virt.driver [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.760 229500 INFO nova.compute.provider_config [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.768 229500 WARNING nova.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.768 229500 DEBUG oslo_concurrency.lockutils [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.768 229500 DEBUG oslo_concurrency.lockutils [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_concurrency.lockutils [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] console_host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] my_block_storage_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] my_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] resize_confirm_window = 0 log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 
04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service 
[None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_rootwrap_daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 
2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost 
nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.enable_instance_password = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.max_limit = 1000 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.config_prefix = cache.oslo log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 
09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.split_loggers = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.collect_timing = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.804 
229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.sqlite_synchronous = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] devices.enabled_mdev_types = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost 
nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - 
-] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.insecure = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 
04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 
2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.823 
229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.barbican_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost 
nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG 
oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.connection_uri = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.gid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.iser_use_multipath = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.837 229500 WARNING oslo_config.cfg [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Feb 20 04:24:01 localhost nova_compute[229496]: live_migration_uri is deprecated for removal in favor of two other options that Feb 20 04:24:01 localhost nova_compute[229496]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Feb 20 04:24:01 localhost nova_compute[229496]: and ``live_migration_inbound_addr`` respectively. Feb 20 04:24:01 localhost nova_compute[229496]: ). 
Its value may be silently ignored in the future.#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_secret_uuid = a8557ee9-b55d-5519-942c-cf8f6172f1d8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 
localhost nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 
09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 
04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 
2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 
09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG 
oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 
localhost nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG 
oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost 
nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost 
nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None 
req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.894 
229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG 
oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG 
oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.897 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.897 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.897 229500 DEBUG 
oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.897 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.898 229500 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.911 229500 INFO nova.virt.node [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.911 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.912 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.912 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.912 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.923 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.925 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.926 229500 INFO nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Connection event '1' reason 'None'#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.937 229500 DEBUG nova.virt.libvirt.volume.mount [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.946 229500 INFO nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host capabilities
[The libvirt host-capabilities XML that follows in the original log was mangled during extraction: its markup was stripped, leaving only scattered element text interleaved with empty journald prefixes. Recoverable values from the flattened XML: host UUID f44a30b3-674b-4e65-a07d-fb3d71d4ae11, arch x86_64, CPU model EPYC-Rome-v4, vendor AMD, migration transports tcp and rdma, memory figures 16116612 / 4029153, security model selinux. Log truncated mid-entry at this point.]
nova_compute[229496]: 0 Feb 20 04:24:01 localhost nova_compute[229496]: system_u:system_r:svirt_t:s0 Feb 20 04:24:01 localhost nova_compute[229496]: system_u:system_r:svirt_tcg_t:s0 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: dac Feb 20 04:24:01 localhost nova_compute[229496]: 0 Feb 20 04:24:01 localhost nova_compute[229496]: +107:+107 Feb 20 04:24:01 localhost nova_compute[229496]: +107:+107 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: hvm Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: 32 Feb 20 04:24:01 localhost nova_compute[229496]: /usr/libexec/qemu-kvm Feb 20 04:24:01 localhost nova_compute[229496]: pc-i440fx-rhel7.6.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.8.0 Feb 20 04:24:01 localhost nova_compute[229496]: q35 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.6.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.6.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.4.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.5.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.3.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel7.6.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.4.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.2.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.2.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.0.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.0.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.1.0 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost 
nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: hvm Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: 64 Feb 20 04:24:01 localhost nova_compute[229496]: /usr/libexec/qemu-kvm Feb 20 04:24:01 localhost nova_compute[229496]: pc-i440fx-rhel7.6.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.8.0 Feb 20 04:24:01 localhost nova_compute[229496]: q35 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.6.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.6.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.4.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.5.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.3.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel7.6.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.4.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.2.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.2.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.0.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.0.0 Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel8.1.0 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 
localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: #033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.955 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Feb 20 04:24:01 localhost nova_compute[229496]: 2026-02-20 09:24:01.976 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: /usr/libexec/qemu-kvm Feb 20 04:24:01 localhost nova_compute[229496]: kvm Feb 20 04:24:01 localhost nova_compute[229496]: pc-q35-rhel9.8.0 Feb 20 04:24:01 localhost nova_compute[229496]: i686 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: rom Feb 20 04:24:01 localhost nova_compute[229496]: 
pflash Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: yes Feb 20 04:24:01 localhost nova_compute[229496]: no Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: no Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: on Feb 20 04:24:01 localhost nova_compute[229496]: off Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: on Feb 20 04:24:01 localhost nova_compute[229496]: off Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Rome Feb 20 04:24:01 localhost nova_compute[229496]: AMD Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost 
nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: 486 Feb 20 04:24:01 localhost nova_compute[229496]: 486-v1 Feb 20 04:24:01 localhost nova_compute[229496]: Broadwell Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Broadwell-IBRS Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Broadwell-noTSX Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Broadwell-noTSX-IBRS Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost 
nova_compute[229496]: Broadwell-v1 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Broadwell-v2 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Broadwell-v3 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Broadwell-v4 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cascadelake-Server Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost 
nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cascadelake-Server-noTSX Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cascadelake-Server-v1 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cascadelake-Server-v2 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost 
nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cascadelake-Server-v3 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cascadelake-Server-v4 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cascadelake-Server-v5 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost 
nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: ClearwaterForest Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost 
nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: ClearwaterForest-v1 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost 
nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Conroe Feb 20 04:24:01 localhost nova_compute[229496]: Conroe-v1 Feb 20 04:24:01 localhost nova_compute[229496]: Cooperlake Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cooperlake-v1 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 
localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Cooperlake-v2 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Denverton Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Denverton-v1 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 
04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Denverton-v2 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Denverton-v3 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Dhyana Feb 20 04:24:01 localhost nova_compute[229496]: Dhyana-v1 Feb 20 04:24:01 localhost nova_compute[229496]: Dhyana-v2 Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: EPYC Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Genoa Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 localhost nova_compute[229496]: Feb 20 04:24:01 
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Genoa-v1
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Genoa-v2
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-IBPB
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Milan
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Milan-v1
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Milan-v2
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Milan-v3
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Rome
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Rome-v1
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Rome-v2
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Rome-v3
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Rome-v4
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Rome-v5
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Turin
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-Turin-v1
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-v1
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-v2
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-v3
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-v4
Feb 20 04:24:01 localhost nova_compute[229496]: EPYC-v5
Feb 20 04:24:01 localhost nova_compute[229496]: GraniteRapids
Feb 20 04:24:01 localhost nova_compute[229496]: GraniteRapids-v1
Feb 20 04:24:02 localhost nova_compute[229496]: GraniteRapids-v2
Feb 20 04:24:02 localhost nova_compute[229496]: GraniteRapids-v3
Feb 20 04:24:02 localhost nova_compute[229496]: Haswell
Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-IBRS
Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-noTSX
Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-noTSX-IBRS
Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-v2
Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-v3
Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-v4
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-noTSX
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v2
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v3
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v4
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v5
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v6
Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v7
Feb 20 04:24:02 localhost nova_compute[229496]: IvyBridge
Feb 20 04:24:02 localhost nova_compute[229496]: IvyBridge-IBRS
Feb 20 04:24:02 localhost nova_compute[229496]: IvyBridge-v1
Feb 20 04:24:02 localhost nova_compute[229496]: IvyBridge-v2
Feb 20 04:24:02 localhost nova_compute[229496]: KnightsMill
Feb 20 04:24:02 localhost nova_compute[229496]: KnightsMill-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Nehalem
Feb 20 04:24:02 localhost nova_compute[229496]: Nehalem-IBRS
Feb 20 04:24:02 localhost nova_compute[229496]: Nehalem-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Nehalem-v2
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G1
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G1-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G2
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G2-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G3
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G3-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G4
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G4-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G5
Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G5-v1
Feb 20 04:24:02 localhost nova_compute[229496]: Penryn
Feb 20 04:24:02 localhost nova_compute[229496]: Penryn-v1
Feb 20 04:24:02 localhost nova_compute[229496]: SandyBridge
Feb 20 04:24:02 localhost nova_compute[229496]: SandyBridge-IBRS
Feb 20 04:24:02 localhost nova_compute[229496]: SandyBridge-v1
Feb 20 04:24:02 localhost nova_compute[229496]: SandyBridge-v2
Feb 20 04:24:02 localhost nova_compute[229496]: SapphireRapids
Feb 20 04:24:02 localhost nova_compute[229496]: SapphireRapids-v1
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SapphireRapids-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SapphireRapids-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SapphireRapids-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 
20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 
20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v5 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Westmere Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-v2 Feb 20 04:24:02 localhost nova_compute[229496]: athlon Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: athlon-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: core2duo Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: core2duo-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: coreduo Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: coreduo-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: kvm32 Feb 20 04:24:02 localhost nova_compute[229496]: kvm32-v1 Feb 20 04:24:02 localhost nova_compute[229496]: kvm64 Feb 20 04:24:02 localhost nova_compute[229496]: kvm64-v1 Feb 20 04:24:02 localhost nova_compute[229496]: n270 Feb 20 
Feb 20 04:24:02 localhost nova_compute[229496]: [libvirt domainCapabilities XML dump; element markup lost in log capture — recoverable values summarized below]
Feb 20 04:24:02 localhost nova_compute[229496]:   cpu models (tail of list): n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Feb 20 04:24:02 localhost nova_compute[229496]:   memory backing source types: file anonymous memfd
Feb 20 04:24:02 localhost nova_compute[229496]:   disk devices: disk cdrom floppy lun; buses: fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Feb 20 04:24:02 localhost nova_compute[229496]:   graphics types: vnc egl-headless dbus
Feb 20 04:24:02 localhost nova_compute[229496]:   hostdev mode: subsystem; startup policies: default mandatory requisite optional; subsystem types: usb pci scsi; models: virtio virtio-transitional virtio-non-transitional
Feb 20 04:24:02 localhost nova_compute[229496]:   rng backends: random egd builtin
Feb 20 04:24:02 localhost nova_compute[229496]:   filesystem driver types: path handle virtiofs
Feb 20 04:24:02 localhost nova_compute[229496]:   tpm models: tpm-tis tpm-crb; backends: emulator external; version: 2.0
Feb 20 04:24:02 localhost nova_compute[229496]:   redirdev bus: usb; channel types: pty unix
Feb 20 04:24:02 localhost nova_compute[229496]:   crypto backends: qemu builtin; interface backends: default passt; panic models: isa hyperv
Feb 20 04:24:02 localhost nova_compute[229496]:   console/serial types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Feb 20 04:24:02 localhost nova_compute[229496]:   hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input; additional values: 4095 on off off "Linux KVM Hv"
Feb 20 04:24:02 localhost nova_compute[229496]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:01.995 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 20 04:24:02 localhost nova_compute[229496]: [domainCapabilities XML dump; element markup lost in log capture — recoverable values summarized below]
Feb 20 04:24:02 localhost nova_compute[229496]:   emulator path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-i440fx-rhel7.6.0; arch: i686
Feb 20 04:24:02 localhost nova_compute[229496]:   loader value: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader types: rom pflash; readonly: yes no; secure: no; on off; on off
Feb 20 04:24:02 localhost nova_compute[229496]:   host CPU model: EPYC-Rome; vendor: AMD
Feb 20 04:24:02 localhost nova_compute[229496]:   cpu models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 ClearwaterForest ClearwaterForest-v1 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-Genoa-v2 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Milan-v3 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-Rome-v5 EPYC-Turin EPYC-Turin-v1
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-v1 Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-v2 Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-v5 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: GraniteRapids Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 
20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: GraniteRapids-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: GraniteRapids-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: GraniteRapids-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Haswell Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-noTSX Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Haswell-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-noTSX Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: 
Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v5 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: 
Icelake-Server-v6 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Icelake-Server-v7 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: IvyBridge Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: IvyBridge-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: IvyBridge-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: IvyBridge-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: KnightsMill Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: KnightsMill-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Nehalem Feb 20 04:24:02 localhost nova_compute[229496]: Nehalem-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Nehalem-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Nehalem-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G1 Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G1-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G2 Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G2-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G3 Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G3-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G4-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Opteron_G5 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
Feb 20 04:24:02 localhost nova_compute[229496]: [multi-line libvirt domain-capabilities payload; XML markup lost in log extraction — recoverable values follow]
Feb 20 04:24:02 localhost nova_compute[229496]: supported CPU models: Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SapphireRapids-v4 SierraForest SierraForest-v1 SierraForest-v2 SierraForest-v3 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Feb 20 04:24:02 localhost nova_compute[229496]: memory backing source types: file anonymous memfd
Feb 20 04:24:02 localhost nova_compute[229496]: disk device types: disk cdrom floppy lun; disk bus types: ide fdc scsi virtio usb sata; disk models: virtio virtio-transitional virtio-non-transitional
Feb 20 04:24:02 localhost nova_compute[229496]: graphics types: vnc egl-headless dbus
Feb 20 04:24:02 localhost nova_compute[229496]: hostdev mode: subsystem; startup policy: default mandatory requisite optional; subsystem types: usb pci scsi
Feb 20 04:24:02 localhost nova_compute[229496]: rng backend models: random egd builtin
Feb 20 04:24:02 localhost nova_compute[229496]: filesystem driver types: path handle virtiofs
Feb 20 04:24:02 localhost nova_compute[229496]: tpm models: tpm-tis tpm-crb; tpm backends: emulator external; version: 2.0
Feb 20 04:24:02 localhost nova_compute[229496]: interface backend types: default passt
Feb 20 04:24:02 localhost nova_compute[229496]: panic models: isa hyperv
Feb 20 04:24:02 localhost nova_compute[229496]: character device types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Feb 20 04:24:02 localhost nova_compute[229496]: further values whose enclosing elements were lost: virtio virtio-transitional virtio-non-transitional; usb; pty unix; qemu; builtin
Feb 20 04:24:02 localhost nova_compute[229496]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: 4095 Feb 20 04:24:02 localhost nova_compute[229496]: on Feb 20 04:24:02 localhost nova_compute[229496]: off Feb 20 04:24:02 localhost nova_compute[229496]: off Feb 20 04:24:02 localhost nova_compute[229496]: Linux KVM Hv Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.027 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.035 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: /usr/libexec/qemu-kvm Feb 20 04:24:02 localhost nova_compute[229496]: kvm Feb 20 04:24:02 localhost nova_compute[229496]: pc-q35-rhel9.8.0 Feb 20 04:24:02 localhost nova_compute[229496]: x86_64 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: efi Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd Feb 20 04:24:02 localhost 
nova_compute[229496]: /usr/share/edk2/ovmf/OVMF_CODE.fd Feb 20 04:24:02 localhost nova_compute[229496]: /usr/share/edk2/ovmf/OVMF.amdsev.fd Feb 20 04:24:02 localhost nova_compute[229496]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: rom Feb 20 04:24:02 localhost nova_compute[229496]: pflash Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: yes Feb 20 04:24:02 localhost nova_compute[229496]: no Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: yes Feb 20 04:24:02 localhost nova_compute[229496]: no Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: on Feb 20 04:24:02 localhost nova_compute[229496]: off Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: on Feb 20 04:24:02 localhost nova_compute[229496]: off Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Rome Feb 20 04:24:02 localhost nova_compute[229496]: AMD Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: 486 Feb 20 04:24:02 localhost nova_compute[229496]: 486-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-noTSX Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 
20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-noTSX Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v5 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: ClearwaterForest Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: ClearwaterForest-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Conroe Feb 20 04:24:02 localhost nova_compute[229496]: Conroe-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Cooperlake Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cooperlake-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cooperlake-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Denverton Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Denverton-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Denverton-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Denverton-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Dhyana Feb 20 04:24:02 localhost nova_compute[229496]: Dhyana-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Dhyana-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Genoa Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 
20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Genoa-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Genoa-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-IBPB Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Milan Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
Feb 20 04:24:02 localhost nova_compute[229496]: [supported CPU models reported by libvirt; surrounding XML markup not captured by the logger] EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Milan-v3 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-Rome-v5 EPYC-Turin EPYC-Turin-v1 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 EPYC-v5 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 GraniteRapids-v3 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SapphireRapids-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v5 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 
20 04:24:02 localhost nova_compute[229496]: Snowridge Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Westmere Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-v2 Feb 20 04:24:02 localhost nova_compute[229496]: athlon Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: athlon-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: core2duo Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: core2duo-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: coreduo Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: coreduo-v1 Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: kvm32 Feb 20 04:24:02 localhost nova_compute[229496]: kvm32-v1 Feb 20 04:24:02 localhost nova_compute[229496]: kvm64 Feb 20 04:24:02 localhost nova_compute[229496]: kvm64-v1 Feb 20 04:24:02 localhost nova_compute[229496]: n270 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: n270-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: pentium Feb 20 04:24:02 localhost nova_compute[229496]: pentium-v1 Feb 20 04:24:02 localhost nova_compute[229496]: pentium2 Feb 20 04:24:02 localhost nova_compute[229496]: pentium2-v1 Feb 20 04:24:02 localhost nova_compute[229496]: pentium3 Feb 20 04:24:02 localhost nova_compute[229496]: pentium3-v1 Feb 20 04:24:02 localhost nova_compute[229496]: phenom Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: phenom-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: qemu32 Feb 20 04:24:02 localhost nova_compute[229496]: qemu32-v1 Feb 20 04:24:02 localhost nova_compute[229496]: qemu64 Feb 20 04:24:02 localhost nova_compute[229496]: qemu64-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
Feb 20 04:24:02 localhost nova_compute[229496]: [device capability enums follow; the enum and element names were lost with the markup. Recoverable values, grouped as they appeared (" | " separates groups):]
Feb 20 04:24:02 localhost nova_compute[229496]: file anonymous memfd | disk cdrom floppy lun | fdc scsi virtio usb sata | virtio virtio-transitional virtio-non-transitional | vnc egl-headless dbus | subsystem | default mandatory requisite optional | usb pci scsi | virtio virtio-transitional virtio-non-transitional | random egd builtin | path handle virtiofs | tpm-tis tpm-crb | emulator external | 2.0 | usb | pty unix | qemu | builtin | default passt | isa hyperv | null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Feb 20 04:24:02 localhost nova_compute[229496]: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input | 4095 on off off | Linux KVM Hv
Feb 20 04:24:02 localhost nova_compute[229496]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.094 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 20 04:24:02 localhost nova_compute[229496]: [capabilities XML follows; markup again lost. Recoverable values: /usr/libexec/qemu-kvm kvm pc-i440fx-rhel7.6.0 x86_64]
nova_compute[229496]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: rom Feb 20 04:24:02 localhost nova_compute[229496]: pflash Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: yes Feb 20 04:24:02 localhost nova_compute[229496]: no Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: no Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: on Feb 20 04:24:02 localhost nova_compute[229496]: off Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: on Feb 20 04:24:02 localhost nova_compute[229496]: off Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Rome Feb 20 04:24:02 localhost nova_compute[229496]: AMD Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: 486 Feb 20 04:24:02 localhost nova_compute[229496]: 486-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-noTSX Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Broadwell-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-noTSX Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cascadelake-Server-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: 
Cascadelake-Server-v5 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: ClearwaterForest Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: ClearwaterForest-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Conroe Feb 20 04:24:02 localhost nova_compute[229496]: Conroe-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Cooperlake Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Cooperlake-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Cooperlake-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Denverton Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Denverton-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Denverton-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Denverton-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Dhyana Feb 20 04:24:02 localhost nova_compute[229496]: Dhyana-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Dhyana-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Genoa Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Genoa-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Genoa-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-IBPB Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Milan Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Milan-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Milan-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: EPYC-Milan-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
Feb 20 04:24:02 localhost nova_compute[229496]: [libvirt CPU model enumeration, condensed; repeated timestamp prefixes from stripped XML removed. Models reported: EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, GraniteRapids-v3, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SapphireRapids-v4]
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: SierraForest-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 
04:24:02 localhost nova_compute[229496]: Skylake-Client-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Client-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-noTSX-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Skylake-Server-v5 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v2 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Snowridge-v3 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 
localhost nova_compute[229496]: Snowridge-v4 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Westmere Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-IBRS Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Westmere-v2 Feb 20 04:24:02 localhost nova_compute[229496]: athlon Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: athlon-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: core2duo Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: core2duo-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: coreduo Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: coreduo-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost 
nova_compute[229496]: kvm32 Feb 20 04:24:02 localhost nova_compute[229496]: kvm32-v1 Feb 20 04:24:02 localhost nova_compute[229496]: kvm64 Feb 20 04:24:02 localhost nova_compute[229496]: kvm64-v1 Feb 20 04:24:02 localhost nova_compute[229496]: n270 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: n270-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: pentium Feb 20 04:24:02 localhost nova_compute[229496]: pentium-v1 Feb 20 04:24:02 localhost nova_compute[229496]: pentium2 Feb 20 04:24:02 localhost nova_compute[229496]: pentium2-v1 Feb 20 04:24:02 localhost nova_compute[229496]: pentium3 Feb 20 04:24:02 localhost nova_compute[229496]: pentium3-v1 Feb 20 04:24:02 localhost nova_compute[229496]: phenom Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: phenom-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: qemu32 Feb 20 04:24:02 localhost nova_compute[229496]: qemu32-v1 Feb 20 04:24:02 localhost nova_compute[229496]: qemu64 Feb 20 04:24:02 localhost nova_compute[229496]: qemu64-v1 Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: file Feb 20 04:24:02 localhost nova_compute[229496]: anonymous Feb 20 04:24:02 localhost 
nova_compute[229496]: memfd Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: disk Feb 20 04:24:02 localhost nova_compute[229496]: cdrom Feb 20 04:24:02 localhost nova_compute[229496]: floppy Feb 20 04:24:02 localhost nova_compute[229496]: lun Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: ide Feb 20 04:24:02 localhost nova_compute[229496]: fdc Feb 20 04:24:02 localhost nova_compute[229496]: scsi Feb 20 04:24:02 localhost nova_compute[229496]: virtio Feb 20 04:24:02 localhost nova_compute[229496]: usb Feb 20 04:24:02 localhost nova_compute[229496]: sata Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: virtio Feb 20 04:24:02 localhost nova_compute[229496]: virtio-transitional Feb 20 04:24:02 localhost nova_compute[229496]: virtio-non-transitional Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: vnc Feb 20 04:24:02 localhost nova_compute[229496]: egl-headless Feb 20 04:24:02 localhost nova_compute[229496]: dbus Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: subsystem Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: default Feb 
20 04:24:02 localhost nova_compute[229496]: mandatory Feb 20 04:24:02 localhost nova_compute[229496]: requisite Feb 20 04:24:02 localhost nova_compute[229496]: optional Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: usb Feb 20 04:24:02 localhost nova_compute[229496]: pci Feb 20 04:24:02 localhost nova_compute[229496]: scsi Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: virtio Feb 20 04:24:02 localhost nova_compute[229496]: virtio-transitional Feb 20 04:24:02 localhost nova_compute[229496]: virtio-non-transitional Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: random Feb 20 04:24:02 localhost nova_compute[229496]: egd Feb 20 04:24:02 localhost nova_compute[229496]: builtin Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: path Feb 20 04:24:02 localhost nova_compute[229496]: handle Feb 20 04:24:02 localhost nova_compute[229496]: virtiofs Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: tpm-tis Feb 20 04:24:02 localhost nova_compute[229496]: tpm-crb Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: Feb 20 04:24:02 localhost nova_compute[229496]: emulator Feb 20 04:24:02 
localhost nova_compute[229496]: [multi-line libvirt domain capabilities XML dump; element markup lost in capture — surviving values, in order: external, 2.0, usb, pty, unix, qemu, builtin, default, passt, isa, hyperv; char-device/channel back-ends: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus; hyperv enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; 4095, on, off, off, Linux KVM Hv] Feb 20 04:24:02 localhost nova_compute[229496]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.198 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.199 229500 INFO nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Secure Boot support detected#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.203 229500 INFO nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.204 229500 INFO
nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.217 229500 DEBUG nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.267 229500 INFO nova.virt.node [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.290 229500 DEBUG nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Verified node 41976f9f-3656-482f-8ad0-c81e454a3952 matches my host np0005625204.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.331 229500 DEBUG nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.335 229500 DEBUG nova.virt.libvirt.vif [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005625204.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-02-20T08:23:36Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": 
"91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.336 229500 DEBUG nova.network.os_vif_util [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.337 229500 DEBUG nova.network.os_vif_util [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - 
-] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.338 229500 DEBUG os_vif [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Feb 20 04:24:02 localhost python3.9[229755]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.380 229500 DEBUG ovsdbapp.backend.ovs_idl [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.381 229500 DEBUG ovsdbapp.backend.ovs_idl [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.381 229500 DEBUG ovsdbapp.backend.ovs_idl [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Created 
schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.381 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.382 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.382 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.383 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.402 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.402 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.403 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.404 229500 INFO oslo.privsep.daemon [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpav37yiqy/privsep.sock']#033[00m Feb 20 04:24:02 localhost python3.9[229849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579441.9562454-3732-49687863207863/.source.yaml _original_basename=.t701q_8c follow=False checksum=a8e9a640ed2d11815875c8a03dd8e15172eb268a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.988 229500 INFO oslo.privsep.daemon [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.897 229850 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.900 229850 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.905 229850 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Feb 20 04:24:02 localhost nova_compute[229496]: 2026-02-20 09:24:02.905 229850 INFO 
oslo.privsep.daemon [-] privsep daemon running as pid 229850#033[00m Feb 20 04:24:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9080 DF PROTO=TCP SPT=56050 DPT=9882 SEQ=3012736530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996BD690000000001030307) Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.218 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.270 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.270 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7aa8e2a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.271 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7aa8e2a-27, col_values=(('external_ids', {'iface-id': 'e7aa8e2a-27a6-452b-906c-21cea166b882', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:ed:d2', 'vm-uuid': 'f9924957-6cff-426e-9f03-c739820f4ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.272 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.272 
229500 INFO os_vif [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.273 229500 DEBUG nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.277 229500 DEBUG nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.277 229500 INFO nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.686 229500 INFO nova.service [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating service version for nova-compute on np0005625204.localdomain from 57 to 66#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.755 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 
09:24:03.756 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.756 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.756 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:24:03 localhost nova_compute[229496]: 2026-02-20 09:24:03.757 229500 DEBUG oslo_concurrency.processutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:24:04 localhost python3.9[229963]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.226 229500 DEBUG oslo_concurrency.processutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.293 229500 DEBUG nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.294 229500 DEBUG nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:24:04 localhost systemd[1]: Started libvirt nodedev daemon. Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.660 229500 WARNING nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.662 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12959MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.663 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.663 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.840 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.840 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.841 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.860 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.881 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 
04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.881 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.905 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.961 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: 
COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AESNI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_FMA3,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 04:24:04 localhost nova_compute[229496]: 2026-02-20 09:24:04.997 229500 DEBUG oslo_concurrency.processutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:24:05 localhost sshd[230135]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:24:05 localhost python3.9[230134]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True 
get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.450 229500 DEBUG oslo_concurrency.processutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.455 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Feb 20 04:24:05 localhost nova_compute[229496]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.455 229500 INFO nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] kernel doesn't support AMD SEV#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.456 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.456 229500 DEBUG nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.520 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updated inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.521 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.521 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 
09:24:05.625 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.656 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.656 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.993s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.656 229500 DEBUG nova.service [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.697 229500 DEBUG nova.service [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Feb 20 04:24:05 localhost nova_compute[229496]: 2026-02-20 09:24:05.698 229500 DEBUG nova.servicegroup.drivers.db [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] DB_Driver: join new ServiceGroup member np0005625204.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Feb 20 04:24:05 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:24:05.980 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:24:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:24:05.981 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:24:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:24:05.982 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:24:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4659 DF PROTO=TCP SPT=40096 DPT=9101 SEQ=2473666474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996CA280000000001030307) Feb 20 04:24:06 localhost python3.9[230246]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:24:07 localhost nova_compute[229496]: 2026-02-20 09:24:07.387 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:24:07 localhost python3.9[230356]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False 
force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None 
requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 20 04:24:07 localhost systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 115.0 (383 of 333 items), suggesting rotation. Feb 20 04:24:07 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 20 04:24:07 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:24:08 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:24:08 localhost nova_compute[229496]: 2026-02-20 09:24:08.254 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:24:08 localhost python3.9[230490]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:24:08 localhost systemd[1]: Stopping nova_compute container... Feb 20 04:24:09 localhost systemd[1]: tmp-crun.AK0QJc.mount: Deactivated successfully. 
Feb 20 04:24:09 localhost journal[206495]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, ) Feb 20 04:24:09 localhost journal[206495]: hostname: np0005625204.localdomain Feb 20 04:24:09 localhost journal[206495]: End of file while reading data: Input/output error Feb 20 04:24:09 localhost systemd[1]: libpod-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782.scope: Deactivated successfully. Feb 20 04:24:09 localhost systemd[1]: libpod-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782.scope: Consumed 4.460s CPU time. Feb 20 04:24:09 localhost podman[230494]: 2026-02-20 09:24:09.092365802 +0000 UTC m=+0.092978371 container died 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', 
'/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute) Feb 20 04:24:09 localhost systemd[1]: tmp-crun.syYPC6.mount: Deactivated successfully. Feb 20 04:24:09 localhost podman[230494]: 2026-02-20 09:24:09.218929613 +0000 UTC m=+0.219542162 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true) Feb 20 04:24:09 localhost podman[230494]: nova_compute Feb 20 04:24:09 localhost podman[230534]: error opening file `/run/crun/299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782/status`: No such file or directory Feb 20 04:24:09 localhost podman[230523]: 2026-02-20 09:24:09.321770276 +0000 UTC m=+0.064224941 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:24:09 localhost podman[230523]: nova_compute Feb 20 04:24:09 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Feb 20 04:24:09 localhost systemd[1]: Stopped nova_compute container. Feb 20 04:24:09 localhost systemd[1]: Starting nova_compute container... Feb 20 04:24:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4661 DF PROTO=TCP SPT=40096 DPT=9101 SEQ=2473666474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996D6280000000001030307) Feb 20 04:24:09 localhost systemd[1]: Started libcrun container. 
Feb 20 04:24:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 20 04:24:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 20 04:24:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 20 04:24:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 04:24:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 04:24:09 localhost podman[230538]: 2026-02-20 09:24:09.453373078 +0000 UTC m=+0.102332648 container init 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': 
['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3) Feb 20 04:24:09 localhost nova_compute[230552]: + sudo -E kolla_set_configs Feb 20 04:24:09 localhost podman[230538]: 2026-02-20 09:24:09.466135126 +0000 UTC m=+0.115094696 container start 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': 
['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3) Feb 20 04:24:09 localhost podman[230538]: nova_compute Feb 20 04:24:09 localhost systemd[1]: Started nova_compute container. 
Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Validating config file Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying service configuration files Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for 
/etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /etc/ceph Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Creating directory /etc/ceph Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:24:09 localhost 
nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Writing out command to execute Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:24:09 localhost nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 20 04:24:09 localhost nova_compute[230552]: ++ cat /run_command Feb 20 04:24:09 localhost nova_compute[230552]: + CMD=nova-compute Feb 20 04:24:09 localhost nova_compute[230552]: + ARGS= Feb 20 04:24:09 localhost nova_compute[230552]: + sudo kolla_copy_cacerts Feb 20 04:24:09 localhost nova_compute[230552]: + [[ ! -n '' ]] Feb 20 04:24:09 localhost nova_compute[230552]: + . 
kolla_extend_start Feb 20 04:24:09 localhost nova_compute[230552]: Running command: 'nova-compute' Feb 20 04:24:09 localhost nova_compute[230552]: + echo 'Running command: '\''nova-compute'\''' Feb 20 04:24:09 localhost nova_compute[230552]: + umask 0022 Feb 20 04:24:09 localhost nova_compute[230552]: + exec nova-compute Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.155 230556 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.156 230556 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.156 230556 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.156 230556 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.270 230556 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.291 230556 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.291 230556 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 20 04:24:11 localhost sshd[230586]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.681 230556 INFO nova.virt.driver [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 20 04:24:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20920 DF PROTO=TCP SPT=36818 DPT=9105 SEQ=3333752055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996DF680000000001030307) Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.798 230556 INFO nova.compute.provider_config [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.837 230556 WARNING nova.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.838 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.838 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - 
- - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.838 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.839 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.839 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.840 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.840 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.840 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.840 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.841 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.841 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.841 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.842 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.842 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.842 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.843 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.843 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.843 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.843 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.844 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.844 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.844 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.845 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - 
- - - - -] console_host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.845 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.845 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.845 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.846 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.846 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.846 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.847 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_ephemeral_format = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.847 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.847 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.848 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.848 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.848 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 
04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.848 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.849 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.849 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.849 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.850 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.850 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.850 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.850 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.851 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.851 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.851 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.852 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.852 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.852 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 
09:24:11.853 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.853 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.853 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.854 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.854 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.854 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.855 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.855 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.855 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.855 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.856 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.856 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.856 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.857 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.857 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_rotate_interval_type = 
days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.857 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.857 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.858 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.858 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.858 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.858 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_user_identity_format = %(user)s 
%(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.859 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.859 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.859 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.860 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.860 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.860 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.860 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_logfile_size_mb = 20 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.861 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.861 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.861 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.862 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.862 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.862 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.862 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] my_block_storage_ip = 192.168.122.108 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.863 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] my_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.863 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.863 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.864 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.864 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.864 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.864 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] osapi_compute_workers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.865 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.865 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.865 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.866 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.866 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.866 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.866 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 
localhost nova_compute[230552]: 2026-02-20 09:24:11.867 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.867 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.867 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.868 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.868 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.868 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.868 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.869 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.869 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.869 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.870 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.870 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.870 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.870 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.871 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] resize_confirm_window = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.871 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.871 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.872 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.872 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.872 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.872 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.873 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.873 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.873 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.873 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.874 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.874 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.874 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.875 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 
04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.875 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.875 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.875 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.876 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.876 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.876 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service 
[None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_rootwrap_daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 
2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost 
nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.enable_instance_password = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.max_limit = 1000 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.config_prefix = cache.oslo log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 
09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.split_loggers = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.collect_timing = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.897 
230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.sqlite_synchronous = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] devices.enabled_mdev_types = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost 
nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - 
-] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.insecure = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 
04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 
2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.916 
230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.barbican_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost 
nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.connection_uri = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.gid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.iser_use_multipath = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.930 230556 WARNING oslo_config.cfg [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Feb 20 04:24:11 localhost nova_compute[230552]: live_migration_uri is deprecated for removal in favor of two other options that Feb 20 04:24:11 localhost nova_compute[230552]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Feb 20 04:24:11 localhost nova_compute[230552]: and ``live_migration_inbound_addr`` respectively. Feb 20 04:24:11 localhost nova_compute[230552]: ). 
Its value may be silently ignored in the future.#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_secret_uuid = a8557ee9-b55d-5519-942c-cf8f6172f1d8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 
localhost nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 
09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.943 230556 
DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.password = 
**** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.service_type = placement log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 
09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 
2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 
2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost 
nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.959 
230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service 
[None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 
04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 
2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 
09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 
localhost nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost 
nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost 
nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None 
req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.990 
230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG 
oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 20 04:24:11 localhost nova_compute[230552]: 2026-02-20 09:24:11.995 230556 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.010 230556 INFO nova.virt.node [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.011 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.011 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.011 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.011 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.021 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.024 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.024 230556 INFO nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Connection event '1' reason 'None'#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.027 230556 INFO nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML elided — the markup was stripped in this capture, leaving only element text interleaved with syslog prefixes; recoverable values: host UUID f44a30b3-674b-4e65-a07d-fb3d71d4ae11, arch x86_64, CPU model EPYC-Rome-v4 (vendor AMD), migration transports tcp and rdma, memory 16116612 KiB (4029153 pages), security models selinux (doi 0, base labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, label +107:+107), hvm guest support for 32-bit and 64-bit via /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (canonical "pc") and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (canonical "q35")]
Feb 20 04:24:12 localhost nova_compute[230552]: #033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.035 230556 DEBUG nova.virt.libvirt.volume.mount [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.036 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.042 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[libvirt domain capabilities XML elided — markup stripped and dump truncated at the end of this capture; recoverable values: emulator /usr/libexec/qemu-kvm, domain type kvm, machine pc-q35-rhel9.8.0, arch i686, OS loader /usr/share/OVMF/OVMF_CODE.secboot.fd (loader types rom and pflash, readonly yes/no, secure no), CPU mode feature toggles on/off, host CPU model EPYC-Rome (vendor AMD)]
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: 486 Feb 20 04:24:12 localhost nova_compute[230552]: 486-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-noTSX Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-noTSX-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Broadwell-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-noTSX Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v5 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: ClearwaterForest Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: ClearwaterForest-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Conroe Feb 20 04:24:12 localhost nova_compute[230552]: Conroe-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Cooperlake Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cooperlake-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cooperlake-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Denverton Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Denverton-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Denverton-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Denverton-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Dhyana Feb 20 04:24:12 localhost nova_compute[230552]: Dhyana-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Dhyana-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Genoa Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Genoa-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Genoa-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-IBPB Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Milan Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Milan-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Milan-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Milan-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: 
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v4 Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v5 Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Turin Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 
Feb 20 04:24:12 localhost nova_compute[230552]: [libvirt CPU model capabilities; empty continuation lines collapsed]
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Turin-v1
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v1
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v2
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v3
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v4
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v5
Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids
Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids-v1
Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids-v2
Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids-v3
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-noTSX
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-noTSX-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v2
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v3
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v4
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-noTSX
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v2
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v3
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v4
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v5
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v6
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v7
Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge
Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge-v1
Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge-v2
Feb 20 04:24:12 localhost nova_compute[230552]: KnightsMill
Feb 20 04:24:12 localhost nova_compute[230552]: KnightsMill-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem
Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-v2
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G1-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G2
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G2-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G3
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G3-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G4
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G4-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G5
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G5-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Penryn
Feb 20 04:24:12 localhost nova_compute[230552]: Penryn-v1
Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge
Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-v1
Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-v2
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v1
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v2
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v3
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v4
Feb 20 04:24:12 localhost nova_compute[230552]: SierraForest
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: SierraForest-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: SierraForest-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: SierraForest-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-noTSX-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 
20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-noTSX-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 
20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v5 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Westmere Feb 20 04:24:12 localhost nova_compute[230552]: Westmere-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Westmere-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Westmere-v2 Feb 20 04:24:12 localhost nova_compute[230552]: athlon Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: athlon-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: core2duo Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: core2duo-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: coreduo Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: coreduo-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: kvm32 Feb 20 04:24:12 localhost nova_compute[230552]: kvm32-v1 Feb 20 04:24:12 localhost nova_compute[230552]: kvm64 Feb 20 04:24:12 localhost nova_compute[230552]: kvm64-v1 Feb 20 04:24:12 localhost nova_compute[230552]: n270 Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: n270-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: pentium Feb 20 04:24:12 localhost nova_compute[230552]: pentium-v1 Feb 20 04:24:12 localhost nova_compute[230552]: pentium2 Feb 20 04:24:12 localhost nova_compute[230552]: pentium2-v1 Feb 20 04:24:12 localhost nova_compute[230552]: pentium3 Feb 20 04:24:12 localhost nova_compute[230552]: pentium3-v1 Feb 20 04:24:12 localhost nova_compute[230552]: phenom Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: phenom-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: qemu32 Feb 20 04:24:12 localhost nova_compute[230552]: qemu32-v1 Feb 20 04:24:12 localhost nova_compute[230552]: qemu64 Feb 20 04:24:12 localhost nova_compute[230552]: qemu64-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: file Feb 20 04:24:12 localhost nova_compute[230552]: anonymous Feb 20 04:24:12 localhost nova_compute[230552]: memfd Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: disk Feb 20 04:24:12 localhost nova_compute[230552]: cdrom Feb 20 04:24:12 localhost nova_compute[230552]: floppy Feb 20 04:24:12 localhost nova_compute[230552]: lun Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: fdc Feb 20 04:24:12 localhost nova_compute[230552]: scsi Feb 20 04:24:12 localhost nova_compute[230552]: virtio Feb 20 04:24:12 localhost nova_compute[230552]: usb Feb 20 04:24:12 localhost nova_compute[230552]: sata Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: virtio Feb 20 04:24:12 localhost nova_compute[230552]: virtio-transitional Feb 20 04:24:12 localhost nova_compute[230552]: virtio-non-transitional Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: vnc Feb 20 04:24:12 localhost nova_compute[230552]: egl-headless Feb 20 04:24:12 localhost nova_compute[230552]: dbus Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: subsystem Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: default Feb 20 04:24:12 localhost nova_compute[230552]: mandatory Feb 20 04:24:12 localhost nova_compute[230552]: requisite Feb 20 04:24:12 localhost nova_compute[230552]: optional Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: usb Feb 20 04:24:12 localhost nova_compute[230552]: pci Feb 20 04:24:12 localhost nova_compute[230552]: scsi Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: virtio Feb 20 04:24:12 localhost nova_compute[230552]: virtio-transitional Feb 20 04:24:12 localhost nova_compute[230552]: virtio-non-transitional Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: random Feb 20 04:24:12 localhost nova_compute[230552]: egd Feb 20 04:24:12 localhost nova_compute[230552]: builtin Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: path Feb 20 04:24:12 localhost nova_compute[230552]: handle Feb 20 04:24:12 localhost nova_compute[230552]: virtiofs Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: tpm-tis Feb 20 04:24:12 localhost nova_compute[230552]: tpm-crb Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: emulator Feb 20 04:24:12 localhost nova_compute[230552]: external Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: 2.0 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: usb Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: pty Feb 20 04:24:12 localhost nova_compute[230552]: unix Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: qemu Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: builtin Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: default Feb 20 04:24:12 localhost nova_compute[230552]: passt Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: isa Feb 20 04:24:12 localhost nova_compute[230552]: hyperv Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: null Feb 20 04:24:12 localhost nova_compute[230552]: vc Feb 20 04:24:12 localhost nova_compute[230552]: pty Feb 20 04:24:12 localhost nova_compute[230552]: dev Feb 20 04:24:12 localhost nova_compute[230552]: file Feb 20 04:24:12 localhost nova_compute[230552]: pipe Feb 20 
04:24:12 localhost nova_compute[230552]: stdio Feb 20 04:24:12 localhost nova_compute[230552]: udp Feb 20 04:24:12 localhost nova_compute[230552]: tcp Feb 20 04:24:12 localhost nova_compute[230552]: unix Feb 20 04:24:12 localhost nova_compute[230552]: qemu-vdagent Feb 20 04:24:12 localhost nova_compute[230552]: dbus Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: relaxed Feb 20 04:24:12 localhost nova_compute[230552]: vapic Feb 20 04:24:12 localhost nova_compute[230552]: spinlocks Feb 20 04:24:12 localhost nova_compute[230552]: vpindex Feb 20 04:24:12 localhost nova_compute[230552]: runtime Feb 20 04:24:12 localhost nova_compute[230552]: synic Feb 20 04:24:12 localhost nova_compute[230552]: stimer Feb 20 04:24:12 localhost nova_compute[230552]: reset Feb 20 04:24:12 localhost nova_compute[230552]: vendor_id Feb 20 04:24:12 localhost nova_compute[230552]: frequencies Feb 20 04:24:12 localhost nova_compute[230552]: reenlightenment Feb 20 04:24:12 localhost nova_compute[230552]: tlbflush Feb 20 04:24:12 localhost nova_compute[230552]: ipi Feb 20 04:24:12 localhost nova_compute[230552]: avic Feb 20 04:24:12 localhost nova_compute[230552]: emsr_bitmap Feb 20 04:24:12 
Feb 20 04:24:12 localhost nova_compute[230552]: [tail of a domain-capabilities XML dump; markup stripped during log extraction — surviving text nodes: xmm_input, 4095, on, off, off, "Linux KVM Hv"] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.049 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 20 04:24:12 localhost nova_compute[230552]: [domain-capabilities XML for this query; markup stripped during log extraction. Recoverable text-node values: emulator /usr/libexec/qemu-kvm; domain type kvm; machine pc-i440fx-rhel7.6.0; arch i686; firmware /usr/share/OVMF/OVMF_CODE.secboot.fd with loader types rom and pflash (readonly yes/no, secure no); toggle values on/off repeated for several features; host CPU model EPYC-Rome, vendor AMD; listed CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, ...]
Feb 20 04:24:12 localhost python3.9[230679]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Haswell Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-noTSX Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-noTSX-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-noTSX Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v5 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v6 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Icelake-Server-v7 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: IvyBridge-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: KnightsMill Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: KnightsMill-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G1 Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G1-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G2 Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G2-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G3 Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G3-v1 Feb 20 04:24:12 localhost 
nova_compute[230552]: Opteron_G4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G4-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G5 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G5-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Penryn Feb 20 04:24:12 localhost nova_compute[230552]: Penryn-v1 Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-v1 Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-v2 Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
Feb 20 04:24:12 localhost nova_compute[230552]: [libvirt domain-capabilities XML logged by nova-compute; XML markup lost in capture, recoverable values summarized in order below]
  CPU models (list in progress): SapphireRapids-v2, SapphireRapids-v3, SapphireRapids-v4, SierraForest, SierraForest-v1, SierraForest-v2, SierraForest-v3, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
  memory backing: file, anonymous, memfd
  disk devices: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; virtio variants: virtio, virtio-transitional, virtio-non-transitional
  graphics: vnc, egl-headless, dbus
  hostdev: subsystem; startup policy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi; virtio variants: virtio, virtio-transitional, virtio-non-transitional
  rng models: random, egd, builtin
  filesystem drivers: path, handle, virtiofs
  tpm models: tpm-tis, tpm-crb; backends: emulator, external; version: 2.0
  redirdev bus: usb; channel types: pty, unix
  crypto backends: qemu, builtin
  interface backends: default, passt
  panic models: isa, hyperv
  serial/console types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
  Hyper-V enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input (further values in this block: 4095, on, off, off, Linux KVM Hv)
Feb 20 04:24:12 localhost nova_compute[230552]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.098 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.106 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
  path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64
  firmware: efi; loader values: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types: rom, pflash; readonly: yes, no; secure: yes, no; further enum values: on, off; on, off
  host CPU: EPYC-Rome, vendor AMD
  CPU models (list continues): 486, 486-v1, Broadwell, …
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-noTSX Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-noTSX-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 
20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-noTSX Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v1 Feb 
20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v5 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: ClearwaterForest Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: ClearwaterForest-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Conroe Feb 20 04:24:12 localhost nova_compute[230552]: 
Conroe-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Cooperlake Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cooperlake-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cooperlake-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Denverton Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Denverton-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Denverton-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Denverton-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Dhyana Feb 20 04:24:12 localhost nova_compute[230552]: Dhyana-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Dhyana-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: EPYC Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Genoa Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Genoa-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Genoa-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-IBPB Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Milan Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Milan-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Milan-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Milan-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v4 Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Rome-v5 Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Turin Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-Turin-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v1
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v2
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v3
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v4
Feb 20 04:24:12 localhost nova_compute[230552]: EPYC-v5
Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids
Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids-v1
Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids-v2
Feb 20 04:24:12 localhost nova_compute[230552]: GraniteRapids-v3
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-noTSX
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-noTSX-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v2
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v3
Feb 20 04:24:12 localhost nova_compute[230552]: Haswell-v4
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-noTSX
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v2
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v3
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v4
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v5
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v6
Feb 20 04:24:12 localhost nova_compute[230552]: Icelake-Server-v7
Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge
Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge-v1
Feb 20 04:24:12 localhost nova_compute[230552]: IvyBridge-v2
Feb 20 04:24:12 localhost nova_compute[230552]: KnightsMill
Feb 20 04:24:12 localhost nova_compute[230552]: KnightsMill-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem
Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Nehalem-v2
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G1-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G2
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G2-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G3
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G3-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G4
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G4-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G5
Feb 20 04:24:12 localhost nova_compute[230552]: Opteron_G5-v1
Feb 20 04:24:12 localhost nova_compute[230552]: Penryn
Feb 20 04:24:12 localhost nova_compute[230552]: Penryn-v1
Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge
Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-IBRS
Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-v1
Feb 20 04:24:12 localhost nova_compute[230552]: SandyBridge-v2
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v1
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v2
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v3
Feb 20 04:24:12 localhost nova_compute[230552]: SapphireRapids-v4
Feb 20 04:24:12 localhost nova_compute[230552]: SierraForest
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: SierraForest-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: SierraForest-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: SierraForest-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-noTSX-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Client-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-noTSX-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Feb 20 04:24:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Skylake-Server-v5 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Snowridge-v4 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 
20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Westmere Feb 20 04:24:12 localhost nova_compute[230552]: Westmere-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Westmere-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Westmere-v2 Feb 20 04:24:12 localhost nova_compute[230552]: athlon Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: athlon-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: core2duo Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: core2duo-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: coreduo Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: coreduo-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: kvm32 Feb 20 04:24:12 localhost nova_compute[230552]: kvm32-v1 Feb 20 04:24:12 localhost nova_compute[230552]: kvm64 Feb 20 04:24:12 localhost nova_compute[230552]: kvm64-v1 Feb 20 04:24:12 localhost nova_compute[230552]: n270 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: n270-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: pentium Feb 20 04:24:12 localhost nova_compute[230552]: pentium-v1 Feb 20 04:24:12 localhost nova_compute[230552]: pentium2 Feb 20 04:24:12 localhost nova_compute[230552]: pentium2-v1 Feb 20 04:24:12 localhost nova_compute[230552]: pentium3 Feb 20 04:24:12 localhost nova_compute[230552]: pentium3-v1 Feb 20 04:24:12 localhost nova_compute[230552]: phenom Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: phenom-v1 Feb 20 04:24:12 localhost systemd[1]: Started libpod-conmon-8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4.scope. 
Feb 20 04:24:12 localhost nova_compute[230552]: [CPU models, continued] qemu32 qemu32-v1 qemu64 qemu64-v1
Feb 20 04:24:12 localhost nova_compute[230552]: [memory backing source types] file anonymous memfd
Feb 20 04:24:12 localhost nova_compute[230552]: [disk device types] disk cdrom floppy lun
Feb 20 04:24:12 localhost nova_compute[230552]: [disk buses] fdc scsi virtio usb sata
Feb 20 04:24:12 localhost nova_compute[230552]: [disk models] virtio virtio-transitional virtio-non-transitional
Feb 20 04:24:12 localhost nova_compute[230552]: [graphics types] vnc egl-headless dbus
Feb 20 04:24:12 localhost nova_compute[230552]: [hostdev] modes: subsystem; startup policies: default mandatory requisite optional; subsystem types: usb pci scsi
Feb 20 04:24:12 localhost nova_compute[230552]: [rng] models: virtio virtio-transitional virtio-non-transitional; backends: random egd builtin
Feb 20 04:24:12 localhost nova_compute[230552]: [filesystem driver types] path handle virtiofs
Feb 20 04:24:12 localhost nova_compute[230552]: [tpm] models: tpm-tis tpm-crb; backends: emulator external; backend version: 2.0
Feb 20 04:24:12 localhost nova_compute[230552]: [redirdev bus] usb
Feb 20 04:24:12 localhost nova_compute[230552]: [channel types] pty unix
Feb 20 04:24:12 localhost nova_compute[230552]: [enum] qemu builtin
Feb 20 04:24:12 localhost nova_compute[230552]: [interface backends] default passt
Feb 20 04:24:12 localhost nova_compute[230552]: [panic models] isa hyperv
Feb 20 04:24:12 localhost nova_compute[230552]: [console/serial types] null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
04:24:12 localhost nova_compute[230552]: [hyperv feature names] relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input
Feb 20 04:24:12 localhost nova_compute[230552]: [additional values] 4095 on off off Linux KVM Hv
Feb 20 04:24:12 localhost nova_compute[230552]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb 20 04:24:12 localhost nova_compute[230552]: 2026-02-20 09:24:12.213 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 20 04:24:12 localhost nova_compute[230552]: [domain capabilities XML follows; markup again lost in capture. Recovered values:]
Feb 20 04:24:12 localhost nova_compute[230552]: [emulator, domain type, machine, arch] /usr/libexec/qemu-kvm kvm pc-i440fx-rhel7.6.0 x86_64
Feb 20 04:24:12 localhost nova_compute[230552]: [os loader] /usr/share/OVMF/OVMF_CODE.secboot.fd; types: rom pflash; yes no; no; on off; on off
Feb 20 04:24:12 localhost nova_compute[230552]: [host CPU model] EPYC-Rome, vendor AMD
Feb 20 04:24:12 localhost systemd[1]: Started libcrun container.
Feb 20 04:24:12 localhost nova_compute[230552]: [CPU models] 486 486-v1 Broadwell Broadwell-IBRS
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-noTSX Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-noTSX-IBRS Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Broadwell-v4 Feb 20 
04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-noTSX Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v1 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v2 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v3 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v4 Feb 20 04:24:12 localhost 
nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Cascadelake-Server-v5 Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: ClearwaterForest Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 
localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:24:12 localhost nova_compute[230552]: Feb 20 04:25:58 localhost python3.9[241812]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:25:58 localhost rsyslogd[758]: imjournal: 4238 messages lost due to rate-limiting (20000 allowed within 600 seconds) Feb 20 04:25:58 localhost systemd[1]: Reloading. Feb 20 04:25:58 localhost systemd-sysv-generator[241840]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:25:58 localhost systemd-rc-local-generator[241834]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
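The rsyslogd line above reports imjournal dropping 4238 messages because more than 20000 arrived within a 600-second window. A minimal sketch of that style of fixed-window rate limit (the class and method names here are illustrative, not rsyslog's actual implementation):

```python
# Illustrative fixed-window rate limiter in the spirit of rsyslog's
# imjournal ratelimit.interval / ratelimit.burst settings (assumption:
# rsyslog's real accounting may differ in detail).
class FixedWindowRateLimit:
    def __init__(self, burst=20000, interval=600):
        self.burst = burst          # messages allowed per window
        self.interval = interval    # window length in seconds
        self.window_start = None
        self.count = 0
        self.dropped = 0            # would be reported at window rollover

    def allow(self, now):
        # Start a fresh window once the old one has elapsed.
        if self.window_start is None or now - self.window_start >= self.interval:
            self.window_start = now
            self.count = 0
        if self.count < self.burst:
            self.count += 1
            return True
        self.dropped += 1
        return False
```

With the defaults above, message 20001 inside one 600-second window is counted as dropped, which is exactly the "messages lost due to rate-limiting" summary the daemon logs.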
Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:58 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:58 localhost nova_compute[230552]: 2026-02-20 09:25:58.562 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:25:58 localhost nova_compute[230552]: 2026-02-20 09:25:58.565 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:25:59 localhost python3.9[241902]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:25:59 localhost systemd[1]: Reloading.
Feb 20 04:25:59 localhost systemd-sysv-generator[241931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:25:59 localhost systemd-rc-local-generator[241927]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:25:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31636 DF PROTO=TCP SPT=54660 DPT=9105 SEQ=2177006928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599884E80000000001030307)
Feb 20 04:25:59 localhost systemd[1]: Starting podman_exporter container...
Feb 20 04:25:59 localhost systemd[1]: Started libcrun container.
Feb 20 04:25:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
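The kernel `DROPPING:` entries above are netfilter log-target output: space-separated `KEY=VALUE` pairs (some with empty values, like `OUT=`) plus bare flags such as `DF` and `SYN`. A small parser (a hypothetical helper, not part of any tool in this log) can turn one record into structured fields:

```python
def parse_nflog(line):
    """Parse a netfilter log record into KEY=VALUE fields and bare flags."""
    fields, flags = {}, []
    # Everything after the rule prefix ('DROPPING:') describes the packet.
    _, _, rest = line.partition("DROPPING:")
    for token in rest.split():
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value       # e.g. SRC=192.168.122.10; OUT= stays ''
        else:
            flags.append(token)       # e.g. DF, SYN, OPT
    return fields, flags
```

Applied to the records above, this makes it easy to see that the dropped packets are inbound SYNs on br-ex from 192.168.122.10 toward exporter ports such as 9105, 9882, and 9101.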
Feb 20 04:25:59 localhost podman[241943]: 2026-02-20 09:25:59.862294533 +0000 UTC m=+0.173733827 container init 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 20 04:25:59 localhost podman_exporter[241957]: ts=2026-02-20T09:25:59.882Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 20 04:25:59 localhost podman_exporter[241957]: ts=2026-02-20T09:25:59.882Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 20 04:25:59 localhost podman_exporter[241957]: ts=2026-02-20T09:25:59.882Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 20 04:25:59 localhost podman_exporter[241957]: ts=2026-02-20T09:25:59.882Z caller=handler.go:105 level=info collector=container
Feb 20 04:25:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 04:25:59 localhost systemd[1]: Starting Podman API Service...
Feb 20 04:25:59 localhost systemd[1]: Started Podman API Service.
Feb 20 04:25:59 localhost podman[241943]: 2026-02-20 09:25:59.894027059 +0000 UTC m=+0.205466393 container start 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 04:25:59 localhost podman[241943]: podman_exporter
Feb 20 04:25:59 localhost systemd[1]: Started podman_exporter container.
Feb 20 04:25:59 localhost podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 20 04:25:59 localhost podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 20 04:25:59 localhost podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="Setting parallel job count to 25"
Feb 20 04:25:59 localhost podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 20 04:25:59 localhost podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Feb 20 04:25:59 localhost podman[241968]: @ - - [20/Feb/2026:09:25:59 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 20 04:25:59 localhost podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 04:26:00 localhost podman[241967]: 2026-02-20 09:26:00.567895535 +0000 UTC m=+0.667012170 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 20 04:26:00 localhost podman[241967]: 2026-02-20 09:26:00.579958764 +0000 UTC m=+0.679075419 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 20 04:26:00 localhost podman[241967]: unhealthy
Feb 20 04:26:01 localhost python3.9[242112]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 04:26:02 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 04:26:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36412 DF PROTO=TCP SPT=59782 DPT=9882 SEQ=722642973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599891690000000001030307)
Feb 20 04:26:02 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 04:26:02 localhost python3.9[242222]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:26:03 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
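The podman container events above embed the container's configuration as a Python-literal `config_data={...}` dict. Because it is a plain literal (strings, lists, dicts, booleans only), `ast.literal_eval` can recover it safely without executing any code. A sketch, using a trimmed sample in the same shape as the podman_exporter events (values shortened for the example):

```python
import ast

# Trimmed config_data literal in the shape seen in the podman events above.
raw = ("{'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, "
       "'net': 'host', 'ports': ['9882:9882'], 'privileged': True, "
       "'restart': 'always', 'user': 'root'}")

# literal_eval parses Python literals only; unlike eval() it cannot run code.
config = ast.literal_eval(raw)
```

This is handy when correlating events: for instance, confirming that the unhealthy exporter publishes port 9882, the same destination port seen in the nearby kernel `DROPPING:` entry.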
Feb 20 04:26:03 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 04:26:03 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 04:26:03 localhost python3.9[242312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579562.472272-2502-233918779226488/.source.yaml _original_basename=.7fgbausk follow=False checksum=dae36056a950a4131d7691afd655cacfc03f4930 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:26:03 localhost nova_compute[230552]: 2026-02-20 09:26:03.566 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 04:26:03 localhost nova_compute[230552]: 2026-02-20 09:26:03.568 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 04:26:03 localhost nova_compute[230552]: 2026-02-20 09:26:03.568 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 04:26:03 localhost nova_compute[230552]: 2026-02-20 09:26:03.568 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 04:26:03 localhost nova_compute[230552]: 2026-02-20 09:26:03.605 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:26:03 localhost nova_compute[230552]: 2026-02-20 09:26:03.606 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 04:26:04 localhost python3.9[242422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:26:04 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 04:26:04 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 04:26:05 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 04:26:05 localhost python3.9[242510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579563.757594-2546-4685911450116/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:26:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:26:05.984 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 04:26:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:26:05.988 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 04:26:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:26:05.992 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 04:26:06 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 04:26:06 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 04:26:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28721 DF PROTO=TCP SPT=37896 DPT=9101 SEQ=1703451543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59989EE50000000001030307)
Feb 20 04:26:06 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 04:26:06 localhost python3.9[242620]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:26:06 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 04:26:06 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 04:26:06 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 04:26:08 localhost python3.9[242730]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:26:08 localhost nova_compute[230552]: 2026-02-20 09:26:08.606 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 04:26:08 localhost nova_compute[230552]: 2026-02-20 09:26:08.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 04:26:08 localhost nova_compute[230552]: 2026-02-20 09:26:08.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 04:26:08 localhost nova_compute[230552]: 2026-02-20 09:26:08.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 04:26:08 localhost nova_compute[230552]: 2026-02-20 09:26:08.633 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:26:08 localhost nova_compute[230552]: 2026-02-20 09:26:08.634 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 04:26:08 localhost nova_compute[230552]: 2026-02-20 09:26:08.637 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:26:08 localhost python3.9[242840]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:26:09 localhost python3.9[242897]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.pg83jp83 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:26:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28723 DF PROTO=TCP SPT=37896 DPT=9101 SEQ=1703451543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998AAE80000000001030307)
Feb 20 04:26:09 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 04:26:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:26:09 localhost systemd[1]: var-lib-containers-storage-overlay-bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5-merged.mount: Deactivated successfully.
Feb 20 04:26:09 localhost podman[242953]: 2026-02-20 09:26:09.686673687 +0000 UTC m=+0.069463567 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:26:09 localhost podman[242953]: 2026-02-20 09:26:09.695700157 +0000 UTC m=+0.078490047 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:26:09 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:26:09 localhost python3.9[243028]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:26:10 localhost systemd[1]: var-lib-containers-storage-overlay-bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5-merged.mount: Deactivated successfully. 
Feb 20 04:26:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31638 DF PROTO=TCP SPT=54660 DPT=9105 SEQ=2177006928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998B5680000000001030307) Feb 20 04:26:12 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:26:12 localhost systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully. Feb 20 04:26:13 localhost nova_compute[230552]: 2026-02-20 09:26:13.635 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:14 localhost systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully. 
Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.366 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.366 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.385 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.385 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.386 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.470 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.470 230556 DEBUG 
oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.470 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.471 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:26:14 localhost nova_compute[230552]: 2026-02-20 09:26:14.946 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": 
"ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.000 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.000 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.001 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.001 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.001 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.001 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.002 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.002 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.002 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.002 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.015 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.015 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.015 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.016 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 
09:26:15.016 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:26:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:26:15 localhost systemd[1]: tmp-crun.UG6X66.mount: Deactivated successfully. Feb 20 04:26:15 localhost podman[243353]: 2026-02-20 09:26:15.307672718 +0000 UTC m=+0.089667939 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:26:15 localhost podman[243353]: 2026-02-20 09:26:15.316893914 +0000 UTC m=+0.098889085 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:26:15 localhost python3.9[243354]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.450 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.496 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.496 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.663 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple 
sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.664 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12644MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.665 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.665 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:26:15 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.759 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.759 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.759 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:26:15 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:26:15 localhost nova_compute[230552]: 2026-02-20 09:26:15.820 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:26:16 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:26:16 localhost podman[243373]: 2026-02-20 09:26:16.084977162 +0000 UTC m=+0.324707547 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 20 04:26:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36127 DF PROTO=TCP SPT=44482 DPT=9102 SEQ=170331392 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998C5680000000001030307) Feb 20 04:26:16 localhost podman[243373]: 2026-02-20 09:26:16.180040395 +0000 UTC m=+0.419770770 
container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 20 04:26:16 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. 
Feb 20 04:26:16 localhost nova_compute[230552]: 2026-02-20 09:26:16.320 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:26:16 localhost nova_compute[230552]: 2026-02-20 09:26:16.328 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:26:16 localhost nova_compute[230552]: 2026-02-20 09:26:16.348 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:26:16 localhost nova_compute[230552]: 2026-02-20 09:26:16.350 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:26:16 localhost nova_compute[230552]: 2026-02-20 09:26:16.351 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:26:16 localhost python3.9[243528]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:26:16 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:16 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:17 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:17 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:26:17 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:26:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:26:17 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:26:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32882 DF PROTO=TCP SPT=57704 DPT=9882 SEQ=1406770644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998CB6A0000000001030307) Feb 20 04:26:17 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:17 localhost podman[243578]: 2026-02-20 09:26:17.73841999 +0000 UTC m=+0.091973676 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible) Feb 20 04:26:17 localhost podman[243578]: 2026-02-20 09:26:17.776012955 +0000 UTC m=+0.129566641 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute) Feb 20 04:26:17 localhost podman[243578]: unhealthy Feb 20 04:26:18 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:26:18 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'. Feb 20 04:26:18 localhost python3[243656]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:26:18 localhost nova_compute[230552]: 2026-02-20 09:26:18.673 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:18 localhost nova_compute[230552]: 2026-02-20 09:26:18.675 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:18 localhost nova_compute[230552]: 2026-02-20 09:26:18.675 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 
20 04:26:18 localhost nova_compute[230552]: 2026-02-20 09:26:18.675 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:18 localhost nova_compute[230552]: 2026-02-20 09:26:18.676 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:18 localhost nova_compute[230552]: 2026-02-20 09:26:18.676 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:18 localhost nova_compute[230552]: 2026-02-20 09:26:18.678 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:20 localhost systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully. Feb 20 04:26:20 localhost systemd[1]: var-lib-containers-storage-overlay-c773c83c6503477114a5b4bf49e71270791ffb8bdafb74f6f588401adb71807d-merged.mount: Deactivated successfully. Feb 20 04:26:20 localhost systemd[1]: var-lib-containers-storage-overlay-c773c83c6503477114a5b4bf49e71270791ffb8bdafb74f6f588401adb71807d-merged.mount: Deactivated successfully. Feb 20 04:26:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32884 DF PROTO=TCP SPT=57704 DPT=9882 SEQ=1406770644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998D7680000000001030307) Feb 20 04:26:22 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. 
Feb 20 04:26:22 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:26:22 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:26:23 localhost nova_compute[230552]: 2026-02-20 09:26:23.680 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:23 localhost nova_compute[230552]: 2026-02-20 09:26:23.682 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:23 localhost nova_compute[230552]: 2026-02-20 09:26:23.682 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:26:23 localhost nova_compute[230552]: 2026-02-20 09:26:23.682 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:23 localhost nova_compute[230552]: 2026-02-20 09:26:23.721 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:23 localhost nova_compute[230552]: 2026-02-20 09:26:23.722 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:24 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:26:24 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:26:24 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:26:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32885 DF PROTO=TCP SPT=57704 DPT=9882 SEQ=1406770644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998E7280000000001030307) Feb 20 04:26:26 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:26 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:26 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:26 localhost podman[243672]: 2026-02-20 09:26:20.421698333 +0000 UTC m=+0.046798782 image pull quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c Feb 20 04:26:27 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:26:27 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:26:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40765 DF PROTO=TCP SPT=54628 DPT=9105 SEQ=2569274226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998F2280000000001030307) Feb 20 04:26:27 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:28 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Feb 20 04:26:28 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Feb 20 04:26:28 localhost nova_compute[230552]: 2026-02-20 09:26:28.722 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:28 localhost nova_compute[230552]: 2026-02-20 09:26:28.724 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40766 DF PROTO=TCP SPT=54628 DPT=9105 SEQ=2569274226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998FA290000000001030307) Feb 20 04:26:30 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:26:30 localhost systemd[1]: var-lib-containers-storage-overlay-fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260-merged.mount: Deactivated successfully. 
Feb 20 04:26:30 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Feb 20 04:26:30 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Feb 20 04:26:30 localhost podman[243732]: 2026-02-20 09:26:28.137945189 +0000 UTC m=+0.050716845 image pull quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c Feb 20 04:26:31 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully. Feb 20 04:26:31 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 20 04:26:31 localhost podman[243732]: Feb 20 04:26:31 localhost podman[243732]: 2026-02-20 09:26:31.368435171 +0000 UTC m=+3.281206777 container create 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, version=9.7, config_id=openstack_network_exporter, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.expose-services=, release=1770267347, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:26:32 localhost python3[243656]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c Feb 20 04:26:32 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:26:32 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:32 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32886 DF PROTO=TCP SPT=57704 DPT=9882 SEQ=1406770644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599907680000000001030307) Feb 20 04:26:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:26:33 localhost nova_compute[230552]: 2026-02-20 09:26:33.725 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:33 localhost nova_compute[230552]: 2026-02-20 09:26:33.727 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:33 localhost nova_compute[230552]: 2026-02-20 09:26:33.728 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:26:33 localhost nova_compute[230552]: 2026-02-20 09:26:33.728 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:33 localhost nova_compute[230552]: 2026-02-20 09:26:33.780 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:33 localhost nova_compute[230552]: 2026-02-20 09:26:33.781 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:34 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 20 04:26:34 localhost systemd[1]: var-lib-containers-storage-overlay-53ada3da4ca04351bf169e5d627c0fcff441ff8e221128687b0e29666c5bc26c-merged.mount: Deactivated successfully. 
Feb 20 04:26:34 localhost podman[243768]: 2026-02-20 09:26:34.122005813 +0000 UTC m=+0.828063858 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:26:34 localhost podman[243768]: 2026-02-20 09:26:34.157011074 +0000 UTC m=+0.863069169 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:26:34 localhost podman[243768]: unhealthy Feb 20 04:26:35 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 20 04:26:35 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 20 04:26:35 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 20 04:26:35 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:26:35 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'. Feb 20 04:26:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18328 DF PROTO=TCP SPT=57176 DPT=9101 SEQ=3187546761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599914140000000001030307) Feb 20 04:26:36 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:36 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 20 04:26:36 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. 
Feb 20 04:26:37 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:37 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:37 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:38 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:26:38 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:38 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:26:38 localhost nova_compute[230552]: 2026-02-20 09:26:38.782 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:38 localhost nova_compute[230552]: 2026-02-20 09:26:38.784 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:38 localhost nova_compute[230552]: 2026-02-20 09:26:38.784 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:26:38 localhost nova_compute[230552]: 2026-02-20 09:26:38.785 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:38 localhost nova_compute[230552]: 2026-02-20 09:26:38.814 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:38 localhost nova_compute[230552]: 2026-02-20 09:26:38.815 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18330 DF PROTO=TCP SPT=57176 DPT=9101 SEQ=3187546761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599920290000000001030307) Feb 20 04:26:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:26:40 localhost podman[243808]: 2026-02-20 09:26:40.144358183 +0000 UTC m=+0.081168834 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:26:40 localhost podman[243808]: 2026-02-20 09:26:40.153201439 +0000 UTC m=+0.090012120 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:26:40 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 20 04:26:40 localhost systemd[1]: var-lib-containers-storage-overlay-271fbe47d50a90f03735a26a1ff5b20e2027c13cb6e9d5c8a6a9112793cd7c92-merged.mount: Deactivated successfully. Feb 20 04:26:40 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:26:41 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:26:41 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 20 04:26:41 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:26:41 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40768 DF PROTO=TCP SPT=54628 DPT=9105 SEQ=2569274226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599929680000000001030307) Feb 20 04:26:41 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:26:41 localhost sshd[243924]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:26:42 localhost python3.9[243923]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:26:42 localhost python3.9[244037]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:26:42 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 20 04:26:42 localhost systemd[1]: var-lib-containers-storage-overlay-0b03ed83be81af8ca31d355d34bc84741adbeedeb0b33580fe27349115e799d7-merged.mount: Deactivated successfully. Feb 20 04:26:43 localhost python3.9[244092]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:26:43 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully. Feb 20 04:26:43 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. 
Feb 20 04:26:43 localhost nova_compute[230552]: 2026-02-20 09:26:43.816 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:43 localhost nova_compute[230552]: 2026-02-20 09:26:43.818 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:43 localhost nova_compute[230552]: 2026-02-20 09:26:43.818 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:26:43 localhost nova_compute[230552]: 2026-02-20 09:26:43.818 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:43 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. 
Feb 20 04:26:43 localhost nova_compute[230552]: 2026-02-20 09:26:43.860 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:43 localhost nova_compute[230552]: 2026-02-20 09:26:43.861 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:43 localhost python3.9[244201]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579603.3670156-2882-133781798226221/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:26:44 localhost python3.9[244256]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:26:44 localhost systemd[1]: Reloading. Feb 20 04:26:44 localhost systemd-rc-local-generator[244284]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:26:44 localhost systemd-sysv-generator[244287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:44 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:44 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully. Feb 20 04:26:44 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:45 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:26:45 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:45 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:45 localhost python3.9[244347]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:26:45 localhost systemd[1]: Reloading. Feb 20 04:26:45 localhost systemd-sysv-generator[244375]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:26:45 localhost systemd-rc-local-generator[244372]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:26:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9499 DF PROTO=TCP SPT=45642 DPT=9102 SEQ=2999730602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599939680000000001030307) Feb 20 04:26:45 localhost systemd[1]: Starting openstack_network_exporter container... Feb 20 04:26:46 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 20 04:26:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:26:46 localhost systemd[1]: var-lib-containers-storage-overlay-33265afbb0ab1192cc35fd8be9e517c4969c8f23f7a1676738a90556ed12fe7c-merged.mount: Deactivated successfully. Feb 20 04:26:46 localhost systemd[1]: Started libcrun container. 
Feb 20 04:26:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0619ed2bc9c42056f6c58d6012f63756e6012dafd4c93436ed775c2fbc752107/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Feb 20 04:26:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0619ed2bc9c42056f6c58d6012f63756e6012dafd4c93436ed775c2fbc752107/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff) Feb 20 04:26:46 localhost podman[244399]: 2026-02-20 09:26:46.514230069 +0000 UTC m=+0.109641681 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:26:46 localhost podman[244399]: 2026-02-20 09:26:46.521793755 +0000 UTC m=+0.117205377 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 04:26:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:26:46 localhost podman[244387]: 2026-02-20 09:26:46.557185091 +0000 UTC m=+0.586030566 container init 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, managed_by=edpm_ansible, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9) Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *bridge.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *coverage.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *datapath.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *iface.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *memory.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *ovn.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *pmd_perf.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *pmd_rxq.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: INFO 09:26:46 main.go:48: registering *vswitch.Collector Feb 20 04:26:46 localhost openstack_network_exporter[244414]: NOTICE 09:26:46 main.go:82: listening on http://:9105/metrics Feb 20 04:26:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:26:46 localhost podman[244387]: 2026-02-20 09:26:46.585294597 +0000 UTC m=+0.614140062 container start 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1770267347, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, version=9.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git) Feb 20 04:26:46 localhost podman[244387]: openstack_network_exporter Feb 20 04:26:47 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:47 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. Feb 20 04:26:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:26:47 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. Feb 20 04:26:47 localhost systemd[1]: Started openstack_network_exporter container. Feb 20 04:26:47 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:26:47 localhost podman[244429]: 2026-02-20 09:26:47.156451826 +0000 UTC m=+0.566145593 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=starting, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:26:47 localhost podman[244429]: 2026-02-20 09:26:47.188120359 +0000 UTC m=+0.597814106 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9) Feb 20 04:26:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5084 DF PROTO=TCP SPT=44340 DPT=9882 SEQ=2692118219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999409B0000000001030307) Feb 20 04:26:47 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:26:47 localhost podman[244440]: 2026-02-20 09:26:47.823808388 +0000 UTC m=+0.718940560 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller) Feb 20 04:26:47 localhost podman[244440]: 2026-02-20 09:26:47.900813639 +0000 UTC m=+0.795945871 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:26:48 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:26:48 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:26:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:26:48 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:26:48 localhost podman[244493]: 2026-02-20 09:26:48.210501096 +0000 UTC m=+0.066753312 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:26:48 localhost podman[244493]: 2026-02-20 09:26:48.21797187 +0000 UTC m=+0.074224116 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:26:48 localhost podman[244493]: unhealthy Feb 20 04:26:48 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:26:48 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'. Feb 20 04:26:48 localhost nova_compute[230552]: 2026-02-20 09:26:48.861 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:48 localhost nova_compute[230552]: 2026-02-20 09:26:48.863 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:48 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. Feb 20 04:26:48 localhost systemd[1]: var-lib-containers-storage-overlay-66b4607051ec4b678b98370429ea66c5b0f53009a9a85441acbc9ac68d517903-merged.mount: Deactivated successfully. Feb 20 04:26:50 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully. Feb 20 04:26:50 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully. 
Feb 20 04:26:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5086 DF PROTO=TCP SPT=44340 DPT=9882 SEQ=2692118219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59994CA80000000001030307) Feb 20 04:26:50 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully. Feb 20 04:26:51 localhost python3.9[244601]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 20 04:26:52 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. Feb 20 04:26:52 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully. Feb 20 04:26:52 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully. Feb 20 04:26:53 localhost python3.9[244711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:26:53 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:26:53 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. Feb 20 04:26:53 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. 
Feb 20 04:26:53 localhost python3.9[244801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579612.5429497-3018-118334797304688/.source.yaml _original_basename=.zyvf5_1v follow=False checksum=3d9c806251215c5317a47411279e51c792f2fd64 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:26:53 localhost systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully. Feb 20 04:26:53 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:26:53 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. 
Feb 20 04:26:53 localhost nova_compute[230552]: 2026-02-20 09:26:53.865 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:53 localhost nova_compute[230552]: 2026-02-20 09:26:53.867 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:53 localhost nova_compute[230552]: 2026-02-20 09:26:53.868 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:26:53 localhost nova_compute[230552]: 2026-02-20 09:26:53.868 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:53 localhost nova_compute[230552]: 2026-02-20 09:26:53.893 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:53 localhost nova_compute[230552]: 2026-02-20 09:26:53.893 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:54 localhost python3.9[244911]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 20 04:26:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5087 DF PROTO=TCP SPT=44340 DPT=9882 SEQ=2692118219 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59995C680000000001030307) Feb 20 04:26:55 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully. Feb 20 04:26:57 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:57 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully. Feb 20 04:26:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33357 DF PROTO=TCP SPT=40374 DPT=9105 SEQ=522897056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599967280000000001030307) Feb 20 04:26:58 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:58 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:26:58 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:26:58 localhost nova_compute[230552]: 2026-02-20 09:26:58.894 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:58 localhost nova_compute[230552]: 2026-02-20 09:26:58.896 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:26:58 localhost nova_compute[230552]: 2026-02-20 09:26:58.896 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:26:58 localhost nova_compute[230552]: 2026-02-20 09:26:58.896 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:58 localhost nova_compute[230552]: 2026-02-20 09:26:58.916 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:26:58 localhost nova_compute[230552]: 2026-02-20 09:26:58.917 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:26:59 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:26:59 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:26:59 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:26:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33358 DF PROTO=TCP SPT=40374 DPT=9105 SEQ=522897056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59996F280000000001030307) Feb 20 04:27:00 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully. Feb 20 04:27:00 localhost systemd[1]: var-lib-containers-storage-overlay-bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a-merged.mount: Deactivated successfully. Feb 20 04:27:00 localhost systemd[1]: var-lib-containers-storage-overlay-bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a-merged.mount: Deactivated successfully. Feb 20 04:27:01 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:01 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 20 04:27:01 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 20 04:27:01 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:02 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:02 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:27:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51020 DF PROTO=TCP SPT=60852 DPT=9100 SEQ=2056959636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59997B690000000001030307) Feb 20 04:27:03 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 20 04:27:03 localhost systemd[1]: var-lib-containers-storage-overlay-0b4a27664720ce930aee8034c0e3a2e981bce86564061fc7e3c5cc60116ab629-merged.mount: Deactivated successfully. Feb 20 04:27:03 localhost nova_compute[230552]: 2026-02-20 09:27:03.917 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:03 localhost nova_compute[230552]: 2026-02-20 09:27:03.920 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:03 localhost nova_compute[230552]: 2026-02-20 09:27:03.920 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:03 localhost nova_compute[230552]: 2026-02-20 09:27:03.920 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:03 localhost nova_compute[230552]: 2026-02-20 09:27:03.962 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:03 localhost nova_compute[230552]: 2026-02-20 09:27:03.963 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:05 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:27:05 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:27:05 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:27:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:27:05 localhost podman[245015]: 2026-02-20 09:27:05.681260634 +0000 UTC m=+0.102410115 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:27:05 localhost podman[245015]: 2026-02-20 09:27:05.691958926 +0000 UTC m=+0.113108387 container exec_died 
010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:27:05 localhost podman[245015]: unhealthy Feb 20 04:27:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:27:05.985 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:27:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:27:05.986 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:27:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:27:05.987 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 
20 04:27:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46618 DF PROTO=TCP SPT=41488 DPT=9101 SEQ=1518676747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599989440000000001030307) Feb 20 04:27:07 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:07 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:27:07 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:27:07 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:27:07 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'. Feb 20 04:27:08 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:08 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:08 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:27:08 localhost nova_compute[230552]: 2026-02-20 09:27:08.963 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:08 localhost nova_compute[230552]: 2026-02-20 09:27:08.965 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:08 localhost nova_compute[230552]: 2026-02-20 09:27:08.965 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:08 localhost nova_compute[230552]: 2026-02-20 09:27:08.965 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:08 localhost nova_compute[230552]: 2026-02-20 09:27:08.995 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:08 localhost nova_compute[230552]: 2026-02-20 09:27:08.995 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:09 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:09 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:27:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46620 DF PROTO=TCP SPT=41488 DPT=9101 SEQ=1518676747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599995680000000001030307) Feb 20 04:27:09 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:27:11 localhost podman[245039]: 2026-02-20 09:27:11.150393578 +0000 UTC m=+0.084026453 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 
'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:27:11 localhost podman[245039]: 2026-02-20 09:27:11.158840059 +0000 UTC m=+0.092472914 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:27:11 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. 
Feb 20 04:27:11 localhost systemd[1]: var-lib-containers-storage-overlay-63f33056b00261d0e07f47c80ba10ef73a797672a3169ee41fd4894170668f6e-merged.mount: Deactivated successfully. Feb 20 04:27:11 localhost systemd[1]: var-lib-containers-storage-overlay-63f33056b00261d0e07f47c80ba10ef73a797672a3169ee41fd4894170668f6e-merged.mount: Deactivated successfully. Feb 20 04:27:11 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:27:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33360 DF PROTO=TCP SPT=40374 DPT=9105 SEQ=522897056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59999F690000000001030307) Feb 20 04:27:12 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:12 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully. Feb 20 04:27:12 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully. Feb 20 04:27:13 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:13 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:13 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:27:14 localhost nova_compute[230552]: 2026-02-20 09:27:13.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:14 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully. Feb 20 04:27:14 localhost systemd[1]: var-lib-containers-storage-overlay-27ac25f75ac951fbeef2be74c2898e3e141e5c323a5908632b2bdca4094605f7-merged.mount: Deactivated successfully. Feb 20 04:27:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48963 DF PROTO=TCP SPT=37642 DPT=9102 SEQ=2769952032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999AF680000000001030307) Feb 20 04:27:16 localhost nova_compute[230552]: 2026-02-20 09:27:16.353 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:16 localhost nova_compute[230552]: 2026-02-20 09:27:16.354 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:16 localhost nova_compute[230552]: 2026-02-20 09:27:16.354 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:27:16 localhost nova_compute[230552]: 2026-02-20 09:27:16.354 230556 DEBUG nova.compute.manager [None 
req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:27:16 localhost nova_compute[230552]: 2026-02-20 09:27:16.503 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:27:16 localhost nova_compute[230552]: 2026-02-20 09:27:16.503 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:27:16 localhost nova_compute[230552]: 2026-02-20 09:27:16.503 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:27:16 localhost nova_compute[230552]: 2026-02-20 09:27:16.504 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:27:16 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:27:16 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. 
Feb 20 04:27:16 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.040 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.230 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 
09:27:17.230 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.231 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.231 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.232 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.232 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.232 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.232 
230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.233 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.233 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.262 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.262 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.263 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.263 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.263 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:27:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57961 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=3653224083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999B5CA0000000001030307) Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.731 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.831 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:27:17 localhost nova_compute[230552]: 2026-02-20 09:27:17.832 230556 DEBUG nova.virt.libvirt.driver [None 
req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.008 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.009 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12623MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.009 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.009 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.138 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.139 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.139 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:27:18 localhost podman[245082]: 2026-02-20 09:27:18.144756565 +0000 UTC m=+0.080210300 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': 
'/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, maintainer=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:27:18 localhost podman[245082]: 2026-02-20 09:27:18.16000651 +0000 UTC m=+0.095460265 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7) Feb 20 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.190 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.201 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.208 12 DEBUG 
ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd627cdf-8f63-493d-b5a7-934b32d197e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.202320', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59c5d24c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': 
'309037857f5827173b9cd85ca4ff641d55498d3bee52b2c262fc0d16259b860d'}]}, 'timestamp': '2026-02-20 09:27:18.209761', '_unique_id': '2cd11d8fad9d4915afa7978b53a4e450'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.212 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.213 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.213 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5c13110-d407-4d0b-b834-b6c4867e5a2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.213393', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59c67a1c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': 'de7d6e230007d60f9b2f787d74afb105ac24a6773d2ff3af72e8756683cad9d5'}]}, 'timestamp': '2026-02-20 09:27:18.213960', '_unique_id': 'e45c3d8041cc4aabb33ec713e81f5647'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.216 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.235 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7aab82c-9de1-4041-953b-e9918ec3ed74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:27:18.216406', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '59c9cffa-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.474230558, 'message_signature': 'd3b916a30ab10980606c4abcfe898c8ccbb3666b99d2387ac496df7102c2b739'}]}, 'timestamp': '2026-02-20 09:27:18.235757', '_unique_id': 'b618959fdf664225af5321afdfc07cf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.237 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac223414-02e4-44cd-a922-275fecc1c479', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.237495', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59ca2310-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '331ec437af9a2897a73d8c9e796c2c6e5792f5b2140128834f2ca5a7178b7aa7'}]}, 'timestamp': '2026-02-20 09:27:18.237828', '_unique_id': '11c733e834a74cf8a2c028c4b7c75637'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.239 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.239 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d650d67-1f3f-4ff8-b805-328026ad48c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.239288', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59ca67d0-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '8d6f0e38bc343f6eff198b47dc761ae0c2180bd56d1d5e8ffb4c517265c9b3ea'}]}, 'timestamp': '2026-02-20 09:27:18.239581', '_unique_id': 'de33689b16a64982a5fda0afad9b3e78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.254 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5f023142-829d-4906-95e3-a554b108049d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.240947', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59ccc35e-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': '9011ea0603f0977ad8339c522d19ebed4561b58a306bdea0dda5eca373f40d15'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.240947', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59ccd042-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': 'b0c578147ff09aa3afb3f6e2e3f4dea2d4ae021177ff93a220832feacb19f53a'}]}, 'timestamp': '2026-02-20 09:27:18.255382', '_unique_id': 'c49c119b5c4f4dee94a5a7d9d8f71ac9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.257 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29109ce3-f454-445c-be59-52632c2af9f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.257155', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59cd229a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': 'ac840176ece93bf0f6a087d6f47e858ace16d0732475c0f839776579fd521ba4'}]}, 'timestamp': '2026-02-20 09:27:18.257494', '_unique_id': '0191877b273c4805b82171244960d0e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.292 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fea2984-91bd-4024-92f4-bf449d63c4ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.258955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d29f36-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '5fc7641cd88c7640d136651ad577186ffb1b6868db81f9dade182e849ad692bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.258955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d2b1ce-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '23984be8de04affb061c35133ef231f7c108317b3567bbc966839e085e442af0'}]}, 'timestamp': '2026-02-20 09:27:18.293966', '_unique_id': '96ccd1a65ef04f84a3d2888a612d83b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.296 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b237f7a-9fa9-44a6-ad5e-0fee599749f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.296677', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d32d20-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '19cf2324b2d148f376f6f35e5ea360c6e465c19e111a6209f70f685fa86883be'}]}, 'timestamp': '2026-02-20 09:27:18.297123', '_unique_id': 'f040b331336049d8a4b99a69ded2f5a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c44ea349-9eac-4b3e-acbf-c96abccb2bb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.299124', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d38c02-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '3f5f985857978e1aab639e0e5e7518208e2d2cf57546d94f54040a082cd52e98'}]}, 'timestamp': '2026-02-20 09:27:18.299562', '_unique_id': '19ff01a33daa487a94c601eb17e0ab0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82581654-2d79-4a51-a62e-67469877044c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.301520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d3ea62-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '1cc3a57ad581ea60a6d6514ecb0359c40428101ee78253a5549c1e8de92974db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.301520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d3f980-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '6a2e5003efeb6780417f4610337113a33315d353ede1ad61fd501345baf77303'}]}, 'timestamp': '2026-02-20 09:27:18.302349', '_unique_id': '0f4f409ec65d404d82c93d5d5c7eb0de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 58020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71808694-8773-4e27-b25d-4ca92bf38f49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 58020000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:27:18.304413', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '59d45baa-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.474230558, 'message_signature': '64ea9c09b6c2d30613aa86af26ab7172c69d9bc7f5b450d2d2362f6e427b6a1e'}]}, 'timestamp': '2026-02-20 09:27:18.304871', '_unique_id': '721df51cf25d41f3801ed1f67ab80148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.306 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7e79424-c2c3-4dc2-af0d-dcd31aaf2ef0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.306808', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d4b8f2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '33fad4e97a17e9508a37074846fe6f799866a2c9200a6a34222da47726806654'}]}, 'timestamp': '2026-02-20 09:27:18.307265', '_unique_id': 'c0a76d4c3d654be69e3fb04003b7480b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.309 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.309 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b7297f2-63bf-4e14-a202-f9885054cb0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.309209', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d515cc-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '9459c5d8a1e1c7598e099c097ebcf8d349308d20932761543b285f0833298244'}]}, 'timestamp': '2026-02-20 09:27:18.309662', '_unique_id': '0f2aa3b126894347b9fbc30009fff9ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 8991 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b5c97103-f308-4be3-a2fd-3e29615e5767', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8991, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.311744', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d578d2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '55f0de0dde6251047b7b54fda4107edb994eab13e911dcb7b2b1a34a17524db2'}]}, 'timestamp': '2026-02-20 09:27:18.312192', '_unique_id': '2f5319f7e5ce47beb92054d7e83e0fca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c734bf9a-fd7b-4e6f-a8c1-3566297b229a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.314126', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d5d5d4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': '081d8a3c21c90c39c679ff0460798fe8b9bdd823706886a0762137754b6157df'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.314126', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d5e696-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': 'b90b4fb3d5ca77b2600caae8d9f3c6023ae6e5173c998f6e41a5a69dfad17846'}]}, 'timestamp': '2026-02-20 09:27:18.314987', '_unique_id': '0804290f14f34abb8ea14a1053486795'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '158e9dc0-983f-4702-9cc6-f4db2c78b937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.316970', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d644ce-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '47f5a2c1967bd92f4eebb42a667a696c9d26431f01c583c099745cc90f8b53f9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.316970', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d6534c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '87653bc174a9c981daf0ed3a3bdcb7f98a0c3dd3a269196be0c4d302625c9db1'}]}, 'timestamp': '2026-02-20 09:27:18.317755', '_unique_id': '91612bf374fb45219191819d51412194'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c4d87e5-38db-40e5-8d86-117e50d0a624', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.319656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d6ae0a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '9f0311c74d1e85c1a7188c7b210715b317023d0c260fc6dca34259646ee99a27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': 
'2026-02-20T09:27:18.319656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d6bca6-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '89e0d38d0e94c9ac7e50bbebe4bb88c660e2f13ef970ed96610da15e39e46a8c'}]}, 'timestamp': '2026-02-20 09:27:18.320428', '_unique_id': 'c7c9c63c4dda40d1b542fc3fd1df1323'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2e5482e-b343-48c6-82e3-b7e38019c4c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.322216', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d71098-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': 'ea806c8ef6dfbec3dcc1ea81f26a8db2a3c28c141174b0bcab9be8192ac6957c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 
'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.322216', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d71ee4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': '7165b4860b750db63c83ce20dccb18ad47d345380b58dfbe0bdb5e037e9cc0c5'}]}, 'timestamp': '2026-02-20 09:27:18.322959', '_unique_id': '958a4da6c3444da48ccbd9a04fe94c3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.324 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.325 12 DEBUG 
ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f722bcaf-3832-4652-a448-8c58c74e7002', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.324755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d773f8-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '44b546b6457768c935532a91665af51c6da4386910322a9783bbf68954ef00fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 
'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.324755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d780dc-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '0ee06d2b8c3a9e75d15c3b2847be7563474589db7e19c7706c7faa2cb383e71a'}]}, 'timestamp': '2026-02-20 09:27:18.325441', '_unique_id': '52686bf7685741659194284ee0c5ef40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:27:18.326 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 
Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcc4e981-52c8-4dca-bf82-44e1fb3914df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.327441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d7db9a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '8d65c4baf20bea04dead104259f8d92f025874f0fd0336a72b7075bfdeb7abfb'}, {'source': 'openstack', 
'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.327441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d7e752-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '6f3744c0b84d0a0894500e5d814ce36f6413cacc38ea25f6fac1fbf790fdc18b'}]}, 'timestamp': '2026-02-20 09:27:18.328031', '_unique_id': 'cd3f68b954a049aaa780bb8aff26bf4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:27:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:27:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:27:18 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.642 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.648 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.666 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.669 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:27:18 localhost nova_compute[230552]: 2026-02-20 09:27:18.669 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:27:18 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:27:18 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:27:19 localhost nova_compute[230552]: 2026-02-20 09:27:18.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:19 localhost nova_compute[230552]: 2026-02-20 09:27:19.002 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:19 localhost nova_compute[230552]: 2026-02-20 09:27:19.002 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:19 localhost nova_compute[230552]: 2026-02-20 09:27:19.002 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:19 localhost nova_compute[230552]: 2026-02-20 09:27:19.042 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:19 localhost nova_compute[230552]: 2026-02-20 09:27:19.043 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:19 
localhost podman[245083]: 2026-02-20 09:27:19.054273062 +0000 UTC m=+0.988139343 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:27:19 localhost podman[245111]: 2026-02-20 
09:27:19.081117144 +0000 UTC m=+0.896149734 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:27:19 localhost podman[245111]: 2026-02-20 09:27:19.126448901 +0000 UTC m=+0.941481451 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Feb 20 04:27:19 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. 
Feb 20 04:27:19 localhost podman[245083]: 2026-02-20 09:27:19.141492092 +0000 UTC m=+1.075358363 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:27:19 localhost podman[245144]: 2026-02-20 
09:27:19.214561395 +0000 UTC m=+0.253747143 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute) Feb 20 04:27:19 localhost 
podman[245144]: 2026-02-20 09:27:19.243502964 +0000 UTC m=+0.282688722 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:27:19 localhost 
podman[245144]: unhealthy Feb 20 04:27:19 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:20 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:20 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:20 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:27:20 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:27:20 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:27:20 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'. Feb 20 04:27:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57963 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=3653224083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999C1E90000000001030307) Feb 20 04:27:20 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:21 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:27:21 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:21 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:23 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. Feb 20 04:27:24 localhost nova_compute[230552]: 2026-02-20 09:27:24.043 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:24 localhost nova_compute[230552]: 2026-02-20 09:27:24.046 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:24 localhost nova_compute[230552]: 2026-02-20 09:27:24.046 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:24 localhost nova_compute[230552]: 2026-02-20 09:27:24.046 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:24 localhost nova_compute[230552]: 2026-02-20 09:27:24.084 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:24 localhost nova_compute[230552]: 2026-02-20 09:27:24.085 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:24 localhost systemd[1]: 
var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:24 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully. Feb 20 04:27:24 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully. Feb 20 04:27:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57964 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=3653224083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999D1A80000000001030307) Feb 20 04:27:25 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:25 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:25 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:26 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:26 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:26 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:27:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59108 DF PROTO=TCP SPT=51124 DPT=9105 SEQ=692478070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999DC690000000001030307) Feb 20 04:27:27 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully. Feb 20 04:27:27 localhost systemd[1]: var-lib-containers-storage-overlay-6a9b5811d370cf611c5d7f7587dd7d8e1e05fe7557daab610e6d30271092c47d-merged.mount: Deactivated successfully. Feb 20 04:27:29 localhost nova_compute[230552]: 2026-02-20 09:27:29.086 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:29 localhost nova_compute[230552]: 2026-02-20 09:27:29.088 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:29 localhost nova_compute[230552]: 2026-02-20 09:27:29.088 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:29 localhost nova_compute[230552]: 2026-02-20 09:27:29.089 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:29 localhost nova_compute[230552]: 2026-02-20 09:27:29.121 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:29 localhost nova_compute[230552]: 2026-02-20 09:27:29.122 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59109 DF PROTO=TCP SPT=51124 DPT=9105 SEQ=692478070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999E4680000000001030307) Feb 20 04:27:30 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:27:30 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. Feb 20 04:27:30 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. Feb 20 04:27:31 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:31 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:27:32 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:27:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57965 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=3653224083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999F1680000000001030307) Feb 20 04:27:33 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:27:33 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:33 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:33 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:33 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:34 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:34 localhost nova_compute[230552]: 2026-02-20 09:27:34.123 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:34 localhost nova_compute[230552]: 2026-02-20 09:27:34.125 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:34 localhost nova_compute[230552]: 2026-02-20 09:27:34.125 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:34 localhost nova_compute[230552]: 2026-02-20 09:27:34.125 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:34 localhost nova_compute[230552]: 2026-02-20 09:27:34.170 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 
04:27:34 localhost nova_compute[230552]: 2026-02-20 09:27:34.171 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11280 DF PROTO=TCP SPT=43356 DPT=9101 SEQ=3933118052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999FE740000000001030307) Feb 20 04:27:36 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. Feb 20 04:27:36 localhost systemd[1]: var-lib-containers-storage-overlay-ef9cc1375c4a3e979779fde9a22d44caa1f8d54d9be8e432ea85c98c54294ad4-merged.mount: Deactivated successfully. Feb 20 04:27:36 localhost systemd[1]: var-lib-containers-storage-overlay-ef9cc1375c4a3e979779fde9a22d44caa1f8d54d9be8e432ea85c98c54294ad4-merged.mount: Deactivated successfully. Feb 20 04:27:37 localhost systemd[1]: var-lib-containers-storage-overlay-ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2-merged.mount: Deactivated successfully. Feb 20 04:27:37 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:27:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:27:37 localhost systemd[1]: var-lib-containers-storage-overlay-5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786-merged.mount: Deactivated successfully. 
Feb 20 04:27:37 localhost podman[245180]: 2026-02-20 09:27:37.955782015 +0000 UTC m=+0.071556973 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:27:37 localhost podman[245180]: 2026-02-20 09:27:37.99008738 +0000 UTC m=+0.105862318 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:27:37 localhost podman[245180]: unhealthy Feb 20 04:27:38 localhost systemd[1]: tmp-crun.sB1Gf2.mount: Deactivated successfully. Feb 20 04:27:38 localhost systemd[1]: var-lib-containers-storage-overlay-5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786-merged.mount: Deactivated successfully. Feb 20 04:27:38 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:27:38 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:27:38 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:27:38 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'. 
Feb 20 04:27:39 localhost nova_compute[230552]: 2026-02-20 09:27:39.172 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:39 localhost nova_compute[230552]: 2026-02-20 09:27:39.174 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:39 localhost nova_compute[230552]: 2026-02-20 09:27:39.174 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:39 localhost nova_compute[230552]: 2026-02-20 09:27:39.175 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:39 localhost nova_compute[230552]: 2026-02-20 09:27:39.207 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:39 localhost nova_compute[230552]: 2026-02-20 09:27:39.209 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:39 localhost systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully. 
Feb 20 04:27:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11282 DF PROTO=TCP SPT=43356 DPT=9101 SEQ=3933118052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A0A680000000001030307) Feb 20 04:27:40 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:40 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. Feb 20 04:27:40 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. Feb 20 04:27:40 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:40 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:40 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:41 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:41 localhost sshd[245203]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:27:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:27:42 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. 
Feb 20 04:27:42 localhost podman[245205]: 2026-02-20 09:27:42.133814023 +0000 UTC m=+0.078461831 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:27:42 localhost podman[245205]: 2026-02-20 09:27:42.144981988 +0000 UTC m=+0.089629776 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:27:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59111 DF PROTO=TCP SPT=51124 DPT=9105 SEQ=692478070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A15680000000001030307) Feb 20 04:27:42 localhost systemd[1]: var-lib-containers-storage-overlay-17d67d7c6c3046ba2041c4048263641e426665d92e1e8fa18e3c871ca9222f66-merged.mount: Deactivated successfully. 
Feb 20 04:27:44 localhost nova_compute[230552]: 2026-02-20 09:27:44.209 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:44 localhost nova_compute[230552]: 2026-02-20 09:27:44.211 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:44 localhost nova_compute[230552]: 2026-02-20 09:27:44.211 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:44 localhost nova_compute[230552]: 2026-02-20 09:27:44.211 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:44 localhost nova_compute[230552]: 2026-02-20 09:27:44.212 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:44 localhost nova_compute[230552]: 2026-02-20 09:27:44.212 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:45 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:27:45 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:27:45 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 20 04:27:45 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:27:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8512 DF PROTO=TCP SPT=48940 DPT=9100 SEQ=4107317744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A25680000000001030307) Feb 20 04:27:47 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:47 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:27:47 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:27:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38848 DF PROTO=TCP SPT=46946 DPT=9882 SEQ=1065721462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A2AFA0000000001030307) Feb 20 04:27:48 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:48 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:27:48 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:27:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:27:49 localhost podman[245228]: 2026-02-20 09:27:49.151612059 +0000 UTC m=+0.091896458 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, version=9.7, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:27:49 localhost podman[245228]: 2026-02-20 09:27:49.166970579 +0000 UTC m=+0.107254988 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, 
name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': 
['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter) Feb 20 04:27:49 localhost nova_compute[230552]: 2026-02-20 09:27:49.243 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:49 localhost nova_compute[230552]: 2026-02-20 09:27:49.245 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:49 localhost nova_compute[230552]: 2026-02-20 09:27:49.246 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:49 localhost nova_compute[230552]: 2026-02-20 09:27:49.246 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:49 localhost nova_compute[230552]: 2026-02-20 09:27:49.251 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:49 localhost nova_compute[230552]: 2026-02-20 09:27:49.252 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:49 localhost systemd[1]: 
7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:27:49 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:49 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:49 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:27:50 localhost sshd[245248]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:27:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:27:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:27:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:27:50 localhost systemd[1]: tmp-crun.qTS2Zk.mount: Deactivated successfully. 
Feb 20 04:27:50 localhost podman[245252]: 2026-02-20 09:27:50.658040451 +0000 UTC m=+0.086448046 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:27:50 localhost 
podman[245252]: 2026-02-20 09:27:50.701990981 +0000 UTC m=+0.130398576 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:27:50 localhost podman[245251]: 2026-02-20 09:27:50.711289501 +0000 UTC 
m=+0.142622207 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:27:50 localhost podman[245251]: 2026-02-20 
09:27:50.744088443 +0000 UTC m=+0.175421119 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute) Feb 20 04:27:50 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38850 DF PROTO=TCP SPT=46946 DPT=9882 SEQ=1065721462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A36E80000000001030307) Feb 20 04:27:50 localhost podman[245251]: unhealthy Feb 20 04:27:52 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:27:52 localhost systemd[1]: var-lib-containers-storage-overlay-4e6d071d08fea63259fe30a26bb9b27228bc0b7a6111c0f215f4e35846a4b7e3-merged.mount: Deactivated successfully. Feb 20 04:27:52 localhost systemd[1]: var-lib-containers-storage-overlay-4e6d071d08fea63259fe30a26bb9b27228bc0b7a6111c0f215f4e35846a4b7e3-merged.mount: Deactivated successfully. Feb 20 04:27:52 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:27:52 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:27:52 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'. 
Feb 20 04:27:52 localhost podman[245250]: 2026-02-20 09:27:52.500281312 +0000 UTC m=+1.919710217 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:27:52 localhost podman[245250]: 2026-02-20 09:27:52.584181107 +0000 UTC m=+2.003610002 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 20 04:27:53 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:27:53 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully. Feb 20 04:27:53 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully. Feb 20 04:27:53 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:27:54 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:27:54 localhost nova_compute[230552]: 2026-02-20 09:27:54.253 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:54 localhost nova_compute[230552]: 2026-02-20 09:27:54.255 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:27:54 localhost nova_compute[230552]: 2026-02-20 09:27:54.255 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:27:54 localhost nova_compute[230552]: 2026-02-20 09:27:54.255 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:54 localhost nova_compute[230552]: 2026-02-20 09:27:54.278 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:54 localhost nova_compute[230552]: 2026-02-20 09:27:54.278 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:27:54 localhost nova_compute[230552]: 2026-02-20 09:27:54.281 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:54 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:27:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38851 DF PROTO=TCP SPT=46946 DPT=9882 SEQ=1065721462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A46A90000000001030307) Feb 20 04:27:55 localhost systemd[1]: var-lib-containers-storage-overlay-4bb1f8a81ebf31c6df88a84cd13b1c78ab0b7c78b4f247f0212f5208091a25c0-merged.mount: Deactivated successfully. Feb 20 04:27:57 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:27:57 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:27:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63867 DF PROTO=TCP SPT=39066 DPT=9105 SEQ=239460387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A51A80000000001030307) Feb 20 04:27:57 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:27:59 localhost nova_compute[230552]: 2026-02-20 09:27:59.280 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:59 localhost nova_compute[230552]: 2026-02-20 09:27:59.284 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:27:59 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:27:59 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:27:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63868 DF PROTO=TCP SPT=39066 DPT=9105 SEQ=239460387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A59A80000000001030307) Feb 20 04:27:59 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:28:00 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:00 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:00 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:01 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:01 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:28:01 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:28:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=171 DF PROTO=TCP SPT=50692 DPT=9102 SEQ=2315604798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A65690000000001030307) Feb 20 04:28:04 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:28:04 localhost systemd[1]: var-lib-containers-storage-overlay-0e94527f44cf462204e4693ca956cece239562477adb3a43148eff33840dc865-merged.mount: Deactivated successfully. Feb 20 04:28:04 localhost nova_compute[230552]: 2026-02-20 09:28:04.285 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:04 localhost systemd[1]: var-lib-containers-storage-overlay-0e94527f44cf462204e4693ca956cece239562477adb3a43148eff33840dc865-merged.mount: Deactivated successfully. 
Feb 20 04:28:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:28:05.989 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:28:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:28:05.992 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:28:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:28:05.995 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:28:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18399 DF PROTO=TCP SPT=56546 DPT=9101 SEQ=4089080019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A73A50000000001030307) Feb 20 04:28:06 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully. Feb 20 04:28:06 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully. Feb 20 04:28:06 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully. 
Feb 20 04:28:07 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. Feb 20 04:28:07 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully. Feb 20 04:28:08 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully. Feb 20 04:28:09 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:28:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:28:09 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. 
Feb 20 04:28:09 localhost podman[245394]: 2026-02-20 09:28:09.131053629 +0000 UTC m=+0.083735521 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:28:09 localhost podman[245394]: 2026-02-20 09:28:09.158470234 +0000 UTC m=+0.111152166 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:28:09 localhost podman[245394]: unhealthy Feb 20 04:28:09 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:28:09 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'. Feb 20 04:28:09 localhost nova_compute[230552]: 2026-02-20 09:28:09.290 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4993-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:09 localhost nova_compute[230552]: 2026-02-20 09:28:09.292 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:09 localhost nova_compute[230552]: 2026-02-20 09:28:09.292 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:28:09 localhost nova_compute[230552]: 2026-02-20 09:28:09.292 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:09 localhost nova_compute[230552]: 2026-02-20 09:28:09.313 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:09 localhost nova_compute[230552]: 2026-02-20 09:28:09.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:09 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: 
Deactivated successfully. Feb 20 04:28:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18401 DF PROTO=TCP SPT=56546 DPT=9101 SEQ=4089080019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A7FA90000000001030307) Feb 20 04:28:09 localhost systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully. Feb 20 04:28:09 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:28:09 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:28:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63870 DF PROTO=TCP SPT=39066 DPT=9105 SEQ=239460387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A89690000000001030307) Feb 20 04:28:11 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully. Feb 20 04:28:12 localhost systemd[1]: var-lib-containers-storage-overlay-7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf-merged.mount: Deactivated successfully. Feb 20 04:28:12 localhost systemd[1]: var-lib-containers-storage-overlay-7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf-merged.mount: Deactivated successfully. Feb 20 04:28:14 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. 
Feb 20 04:28:14 localhost nova_compute[230552]: 2026-02-20 09:28:14.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:14 localhost nova_compute[230552]: 2026-02-20 09:28:14.317 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:14 localhost nova_compute[230552]: 2026-02-20 09:28:14.317 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:28:14 localhost nova_compute[230552]: 2026-02-20 09:28:14.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:14 localhost nova_compute[230552]: 2026-02-20 09:28:14.346 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:14 localhost nova_compute[230552]: 2026-02-20 09:28:14.347 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:14 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:28:14 localhost systemd[1]: session-56.scope: Deactivated successfully. Feb 20 04:28:14 localhost systemd[1]: session-56.scope: Consumed 1min 6.945s CPU time. Feb 20 04:28:14 localhost systemd-logind[759]: Session 56 logged out. Waiting for processes to exit. Feb 20 04:28:14 localhost systemd-logind[759]: Removed session 56. 
Feb 20 04:28:14 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:28:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61176 DF PROTO=TCP SPT=57410 DPT=9100 SEQ=2428030968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A99680000000001030307) Feb 20 04:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:28:16 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:16 localhost systemd[1]: tmp-crun.RPsEsU.mount: Deactivated successfully. Feb 20 04:28:16 localhost podman[245417]: 2026-02-20 09:28:16.176712471 +0000 UTC m=+0.114708507 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:28:16 localhost podman[245417]: 2026-02-20 09:28:16.208675558 +0000 UTC m=+0.146671584 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:28:16 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:28:16 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:28:17 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:17 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:17 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38508 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1173638961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AA02B0000000001030307) Feb 20 04:28:18 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:28:18 localhost nova_compute[230552]: 2026-02-20 09:28:18.610 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:18 localhost nova_compute[230552]: 2026-02-20 09:28:18.610 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:18 localhost nova_compute[230552]: 2026-02-20 09:28:18.636 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:18 localhost nova_compute[230552]: 2026-02-20 09:28:18.637 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:28:18 localhost nova_compute[230552]: 2026-02-20 09:28:18.637 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:28:18 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:28:18 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.185 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.185 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.185 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.186 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.348 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.350 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.351 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 
04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.351 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.388 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.389 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:28:19 localhost systemd[1]: tmp-crun.7NXbzu.mount: Deactivated successfully. Feb 20 04:28:19 localhost podman[245440]: 2026-02-20 09:28:19.636210122 +0000 UTC m=+0.100230375 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public) Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.646 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:28:19 localhost podman[245440]: 2026-02-20 09:28:19.652997336 +0000 UTC m=+0.117017619 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-type=git, release=1770267347, architecture=x86_64, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, 
org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.664 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.665 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.665 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.665 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.666 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.666 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.667 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.667 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.667 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.668 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.683 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.684 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.684 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 09:28:19.684 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:28:19 localhost nova_compute[230552]: 2026-02-20 
09:28:19.685 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:28:20 localhost sshd[245479]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.182 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.257 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.258 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.467 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.468 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12499MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.469 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.469 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.568 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.569 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.569 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:28:20 localhost sshd[245482]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:28:20 localhost nova_compute[230552]: 2026-02-20 09:28:20.617 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:28:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38510 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1173638961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AAC280000000001030307) Feb 20 04:28:21 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 20 04:28:21 localhost nova_compute[230552]: 2026-02-20 09:28:21.122 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:28:21 localhost nova_compute[230552]: 2026-02-20 09:28:21.134 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:28:21 localhost systemd[1]: var-lib-containers-storage-overlay-d0efa0f5db57c39c7bb160a49b5780c03ae06dca3a570fc6900a29b607ec05de-merged.mount: Deactivated successfully. Feb 20 04:28:21 localhost nova_compute[230552]: 2026-02-20 09:28:21.152 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:28:21 localhost nova_compute[230552]: 2026-02-20 09:28:21.154 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:28:21 
localhost nova_compute[230552]: 2026-02-20 09:28:21.155 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:28:21 localhost systemd[1]: var-lib-containers-storage-overlay-d0efa0f5db57c39c7bb160a49b5780c03ae06dca3a570fc6900a29b607ec05de-merged.mount: Deactivated successfully. Feb 20 04:28:21 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:28:22 localhost systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully. Feb 20 04:28:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:28:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:28:22 localhost systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully. Feb 20 04:28:22 localhost systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully. 
Feb 20 04:28:22 localhost podman[245507]: 2026-02-20 09:28:22.690008388 +0000 UTC m=+0.134603376 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 20 04:28:22 localhost 
podman[245507]: 2026-02-20 09:28:22.698893805 +0000 UTC m=+0.143488813 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127) Feb 20 04:28:23 localhost systemd[1]: 
var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. Feb 20 04:28:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:28:23 localhost systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully. Feb 20 04:28:23 localhost systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully. Feb 20 04:28:23 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:28:23 localhost podman[245535]: 2026-02-20 09:28:23.988682849 +0000 UTC m=+0.203935848 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:28:24 localhost podman[245535]: 2026-02-20 09:28:24.072096409 +0000 UTC m=+0.287349448 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:28:24 localhost nova_compute[230552]: 2026-02-20 09:28:24.385 230556 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:24 localhost nova_compute[230552]: 2026-02-20 09:28:24.391 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:24 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:28:24 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. Feb 20 04:28:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38511 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1173638961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599ABBE80000000001030307) Feb 20 04:28:24 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. Feb 20 04:28:24 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:28:25 localhost systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully. Feb 20 04:28:25 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:28:25 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. 
Feb 20 04:28:26 localhost systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully. Feb 20 04:28:26 localhost podman[245506]: 2026-02-20 09:28:26.854867097 +0000 UTC m=+4.301584392 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute) Feb 20 04:28:26 localhost podman[245506]: 2026-02-20 09:28:26.88801855 +0000 UTC m=+4.334735835 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 20 04:28:26 localhost podman[245506]: unhealthy Feb 20 04:28:27 localhost systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully. Feb 20 04:28:27 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 20 04:28:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51025 DF PROTO=TCP SPT=58706 DPT=9105 SEQ=1261460501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AC6E80000000001030307) Feb 20 04:28:27 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 20 04:28:27 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:28:27 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'. Feb 20 04:28:28 localhost systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully. Feb 20 04:28:29 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. 
Feb 20 04:28:29 localhost nova_compute[230552]: 2026-02-20 09:28:29.392 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51026 DF PROTO=TCP SPT=58706 DPT=9105 SEQ=1261460501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599ACEE90000000001030307) Feb 20 04:28:30 localhost sshd[245566]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:28:31 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:28:31 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:28:31 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:28:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38512 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1173638961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599ADB690000000001030307) Feb 20 04:28:33 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:33 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. 
Feb 20 04:28:33 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:28:34 localhost nova_compute[230552]: 2026-02-20 09:28:34.393 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:34 localhost nova_compute[230552]: 2026-02-20 09:28:34.419 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:34 localhost nova_compute[230552]: 2026-02-20 09:28:34.419 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:28:34 localhost nova_compute[230552]: 2026-02-20 09:28:34.420 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:34 localhost nova_compute[230552]: 2026-02-20 09:28:34.422 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:34 localhost nova_compute[230552]: 2026-02-20 09:28:34.422 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:34 localhost nova_compute[230552]: 2026-02-20 09:28:34.425 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:34 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:28:34 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:34 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:35 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:28:35 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:35 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:35 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:28:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55868 DF PROTO=TCP SPT=42486 DPT=9101 SEQ=1402282965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AE8D50000000001030307) Feb 20 04:28:38 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:28:38 localhost systemd[1]: var-lib-containers-storage-overlay-bcdbf69658b435b3643ef361fbfcbd57ebf5cb53d4f9a18cec2f56d5690ff17c-merged.mount: Deactivated successfully. Feb 20 04:28:39 localhost sshd[245568]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:28:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:28:39 localhost systemd-logind[759]: New session 57 of user zuul. Feb 20 04:28:39 localhost systemd[1]: Started Session 57 of User zuul. Feb 20 04:28:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55870 DF PROTO=TCP SPT=42486 DPT=9101 SEQ=1402282965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AF4E80000000001030307) Feb 20 04:28:39 localhost podman[245570]: 2026-02-20 09:28:39.413376613 +0000 UTC m=+0.110410973 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:28:39 localhost podman[245570]: 2026-02-20 09:28:39.418368148 +0000 UTC m=+0.115402478 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 
'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:28:39 localhost podman[245570]: unhealthy Feb 20 04:28:39 localhost nova_compute[230552]: 2026-02-20 09:28:39.446 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:39 localhost nova_compute[230552]: 2026-02-20 09:28:39.447 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:39 localhost nova_compute[230552]: 2026-02-20 09:28:39.448 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5022 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:28:39 localhost nova_compute[230552]: 2026-02-20 09:28:39.448 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:39 localhost nova_compute[230552]: 2026-02-20 09:28:39.449 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:39 localhost nova_compute[230552]: 2026-02-20 09:28:39.449 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:39 localhost nova_compute[230552]: 2026-02-20 09:28:39.451 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:39 localhost python3.9[245687]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman Feb 20 04:28:40 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:28:40 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:28:40 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:28:40 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:28:40 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'. Feb 20 04:28:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51028 DF PROTO=TCP SPT=58706 DPT=9105 SEQ=1261460501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AFF680000000001030307) Feb 20 04:28:42 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:42 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. 
Feb 20 04:28:42 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:28:43 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:43 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:43 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:44 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:28:44 localhost nova_compute[230552]: 2026-02-20 09:28:44.489 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:44 localhost nova_compute[230552]: 2026-02-20 09:28:44.491 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:44 localhost nova_compute[230552]: 2026-02-20 09:28:44.491 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5039 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:28:44 localhost nova_compute[230552]: 2026-02-20 09:28:44.491 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:44 localhost nova_compute[230552]: 2026-02-20 09:28:44.497 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 
04:28:44 localhost nova_compute[230552]: 2026-02-20 09:28:44.497 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:44 localhost nova_compute[230552]: 2026-02-20 09:28:44.500 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:44 localhost python3.9[245810]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:28:44 localhost systemd[1]: Started libpod-conmon-67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.scope. Feb 20 04:28:44 localhost podman[245811]: 2026-02-20 09:28:44.771499653 +0000 UTC m=+0.113648303 container exec 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller) Feb 20 04:28:44 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:44 localhost podman[245811]: 2026-02-20 09:28:44.801660904 +0000 UTC m=+0.143809584 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:28:44 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:28:45 localhost python3.9[245950]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:28:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13461 DF PROTO=TCP SPT=51214 DPT=9102 SEQ=3193409718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B0F680000000001030307) Feb 20 04:28:47 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:28:47 localhost systemd[1]: var-lib-containers-storage-overlay-40a74de9e16f39ddd50a68ccb753b2764268a068f562b46f9bbfdae63acb7788-merged.mount: Deactivated successfully. Feb 20 04:28:47 localhost systemd[1]: libpod-conmon-67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.scope: Deactivated successfully. Feb 20 04:28:47 localhost systemd[1]: Started libpod-conmon-67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.scope. 
Feb 20 04:28:47 localhost podman[245951]: 2026-02-20 09:28:47.297734425 +0000 UTC m=+1.632760993 container exec 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible) Feb 20 04:28:47 localhost podman[245951]: 2026-02-20 09:28:47.332944633 +0000 UTC m=+1.667971241 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:28:47 localhost podman[245962]: 2026-02-20 09:28:47.346756354 +0000 UTC m=+0.281685422 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', 
'--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:28:47 localhost podman[245962]: 2026-02-20 09:28:47.390932771 +0000 UTC m=+0.325861789 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:28:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18703 DF PROTO=TCP SPT=53274 DPT=9882 SEQ=3027371234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B155A0000000001030307) Feb 20 04:28:48 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:48 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully. Feb 20 04:28:48 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully. Feb 20 04:28:48 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:28:48 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:28:48 localhost python3.9[246112]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:28:48 localhost systemd[1]: libpod-conmon-67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.scope: Deactivated successfully. Feb 20 04:28:49 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:49 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:28:49 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:28:49 localhost nova_compute[230552]: 2026-02-20 09:28:49.501 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:49 localhost nova_compute[230552]: 2026-02-20 09:28:49.503 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:49 localhost nova_compute[230552]: 2026-02-20 09:28:49.504 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:28:49 localhost nova_compute[230552]: 2026-02-20 09:28:49.504 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:49 localhost nova_compute[230552]: 2026-02-20 09:28:49.534 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:49 localhost nova_compute[230552]: 2026-02-20 09:28:49.534 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:49 localhost python3.9[246222]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman Feb 20 04:28:50 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully. Feb 20 04:28:50 localhost systemd[1]: var-lib-containers-storage-overlay-2f138d9d6c461962e8cf2ee8539c9294af2f13aab0c8b266d53219a78c733e21-merged.mount: Deactivated successfully. 
Feb 20 04:28:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18705 DF PROTO=TCP SPT=53274 DPT=9882 SEQ=3027371234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B21680000000001030307) Feb 20 04:28:51 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 20 04:28:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:28:51 localhost systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully. Feb 20 04:28:51 localhost systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully. 
Feb 20 04:28:51 localhost podman[246236]: 2026-02-20 09:28:51.756489704 +0000 UTC m=+0.160579206 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:28:51 localhost podman[246236]: 2026-02-20 09:28:51.799222396 +0000 UTC m=+0.203311918 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64) Feb 20 04:28:52 localhost python3.9[246364]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:28:52 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:53 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. 
Feb 20 04:28:53 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 20 04:28:53 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:28:53 localhost systemd[1]: Started libpod-conmon-ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.scope. Feb 20 04:28:53 localhost podman[246365]: 2026-02-20 09:28:53.24434096 +0000 UTC m=+0.880484085 container exec ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 20 04:28:53 localhost podman[246365]: 2026-02-20 09:28:53.249232903 +0000 UTC m=+0.885376068 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:28:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:28:54 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:54 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:28:54 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:28:54 localhost podman[246394]: 2026-02-20 09:28:54.468375883 +0000 UTC m=+0.405381637 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent) Feb 20 04:28:54 localhost 
podman[246394]: 2026-02-20 09:28:54.498723469 +0000 UTC m=+0.435729243 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 20 04:28:54 localhost nova_compute[230552]: 2026-02-20 09:28:54.535 230556 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:54 localhost nova_compute[230552]: 2026-02-20 09:28:54.537 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:54 localhost nova_compute[230552]: 2026-02-20 09:28:54.537 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:28:54 localhost nova_compute[230552]: 2026-02-20 09:28:54.538 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:54 localhost nova_compute[230552]: 2026-02-20 09:28:54.572 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:54 localhost nova_compute[230552]: 2026-02-20 09:28:54.572 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18706 DF PROTO=TCP SPT=53274 DPT=9882 SEQ=3027371234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B31280000000001030307) Feb 20 04:28:54 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:28:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:28:55 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:55 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:28:55 localhost systemd[1]: libpod-conmon-ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.scope: Deactivated successfully. Feb 20 04:28:55 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:28:55 localhost podman[246522]: 2026-02-20 09:28:55.125122293 +0000 UTC m=+0.121944032 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:28:55 localhost python3.9[246521]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:28:55 localhost podman[246522]: 2026-02-20 09:28:55.169102774 +0000 UTC m=+0.165924533 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:28:55 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:28:55 localhost systemd[1]: Started libpod-conmon-ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.scope. Feb 20 04:28:55 localhost podman[246547]: 2026-02-20 09:28:55.524653769 +0000 UTC m=+0.353325836 container exec ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:28:55 localhost podman[246547]: 2026-02-20 09:28:55.531051005 +0000 UTC m=+0.359723112 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127) Feb 20 04:28:56 localhost sshd[246576]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:28:56 localhost systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully. Feb 20 04:28:56 localhost systemd[1]: var-lib-containers-storage-overlay-d59145aa9c81750f9d2e26499ec90595af58708a19d0844b9fae7fcd52a3b340-merged.mount: Deactivated successfully. Feb 20 04:28:56 localhost systemd[1]: var-lib-containers-storage-overlay-d59145aa9c81750f9d2e26499ec90595af58708a19d0844b9fae7fcd52a3b340-merged.mount: Deactivated successfully. Feb 20 04:28:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63337 DF PROTO=TCP SPT=39892 DPT=9105 SEQ=3109845429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B3C290000000001030307) Feb 20 04:28:57 localhost python3.9[246687]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:28:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:28:58 localhost python3.9[246808]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman Feb 20 04:28:59 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:28:59 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:28:59 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:28:59 localhost nova_compute[230552]: 2026-02-20 09:28:59.573 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:59 localhost nova_compute[230552]: 2026-02-20 09:28:59.575 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:28:59 localhost nova_compute[230552]: 2026-02-20 09:28:59.576 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:28:59 localhost nova_compute[230552]: 2026-02-20 09:28:59.576 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:59 localhost nova_compute[230552]: 2026-02-20 09:28:59.602 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:28:59 localhost nova_compute[230552]: 2026-02-20 09:28:59.603 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:28:59 localhost systemd[1]: libpod-conmon-ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.scope: Deactivated successfully. Feb 20 04:28:59 localhost podman[246751]: 2026-02-20 09:28:59.656919032 +0000 UTC m=+1.587977094 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 04:28:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63338 DF PROTO=TCP SPT=39892 DPT=9105 SEQ=3109845429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B44280000000001030307) Feb 20 04:29:01 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:01 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:01 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. 
Feb 20 04:29:01 localhost podman[246751]: 2026-02-20 09:29:01.68113028 +0000 UTC m=+3.612188322 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127) Feb 
20 04:29:01 localhost podman[246751]: unhealthy Feb 20 04:29:02 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:02 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:02 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:02 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:29:02 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'. Feb 20 04:29:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18707 DF PROTO=TCP SPT=53274 DPT=9882 SEQ=3027371234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B51680000000001030307) Feb 20 04:29:03 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:03 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:29:03 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:29:04 localhost python3.9[247005]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:29:04 localhost systemd[1]: Started libpod-conmon-8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.scope. Feb 20 04:29:04 localhost podman[247006]: 2026-02-20 09:29:04.30968721 +0000 UTC m=+0.104489924 container exec 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:29:04 localhost podman[247006]: 2026-02-20 09:29:04.340148504 +0000 UTC m=+0.134951238 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Feb 20 04:29:04 localhost nova_compute[230552]: 2026-02-20 09:29:04.604 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:04 localhost nova_compute[230552]: 2026-02-20 09:29:04.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:04 localhost nova_compute[230552]: 2026-02-20 09:29:04.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:29:04 localhost nova_compute[230552]: 2026-02-20 09:29:04.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:04 localhost nova_compute[230552]: 2026-02-20 09:29:04.637 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:04 localhost nova_compute[230552]: 2026-02-20 09:29:04.638 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:29:05.990 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:29:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:29:05.991 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:29:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:29:05.993 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:29:06 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 20 04:29:06 localhost systemd[1]: var-lib-containers-storage-overlay-5a6d255614f6fb8bbe458bab22374857122c06c78d4c0aacb8f6490a72d4cd61-merged.mount: Deactivated successfully. Feb 20 04:29:06 localhost systemd[1]: var-lib-containers-storage-overlay-5a6d255614f6fb8bbe458bab22374857122c06c78d4c0aacb8f6490a72d4cd61-merged.mount: Deactivated successfully. 
Feb 20 04:29:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57224 DF PROTO=TCP SPT=43146 DPT=9101 SEQ=2809445260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B5E050000000001030307) Feb 20 04:29:06 localhost python3.9[247162]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:29:08 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:08 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:29:09 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:29:09 localhost systemd[1]: libpod-conmon-8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.scope: Deactivated successfully. Feb 20 04:29:09 localhost systemd[1]: Started libpod-conmon-8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.scope. 
Feb 20 04:29:09 localhost podman[247163]: 2026-02-20 09:29:09.152865107 +0000 UTC m=+2.178629905 container exec 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 
04:29:09 localhost podman[247163]: 2026-02-20 09:29:09.18329048 +0000 UTC m=+2.209055338 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 
04:29:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57226 DF PROTO=TCP SPT=43146 DPT=9101 SEQ=2809445260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B6A290000000001030307) Feb 20 04:29:09 localhost nova_compute[230552]: 2026-02-20 09:29:09.638 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:09 localhost nova_compute[230552]: 2026-02-20 09:29:09.680 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:09 localhost nova_compute[230552]: 2026-02-20 09:29:09.680 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:29:09 localhost nova_compute[230552]: 2026-02-20 09:29:09.680 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:09 localhost nova_compute[230552]: 2026-02-20 09:29:09.687 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:09 localhost nova_compute[230552]: 2026-02-20 09:29:09.688 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:10 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:29:10 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:29:11 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:11 localhost podman[247192]: 2026-02-20 09:29:11.299619711 +0000 UTC m=+0.234411300 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:29:11 localhost nova_compute[230552]: 2026-02-20 09:29:11.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:11 localhost nova_compute[230552]: 2026-02-20 09:29:11.300 230556 DEBUG 
nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 20 04:29:11 localhost podman[247192]: 2026-02-20 09:29:11.308958812 +0000 UTC m=+0.243750401 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:29:11 localhost podman[247192]: unhealthy Feb 20 04:29:11 localhost nova_compute[230552]: 2026-02-20 09:29:11.329 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 20 04:29:11 localhost nova_compute[230552]: 2026-02-20 09:29:11.331 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:11 localhost nova_compute[230552]: 2026-02-20 
09:29:11.331 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 20 04:29:11 localhost nova_compute[230552]: 2026-02-20 09:29:11.350 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63340 DF PROTO=TCP SPT=39892 DPT=9105 SEQ=3109845429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B73690000000001030307) Feb 20 04:29:11 localhost python3.9[247323]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:12 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:12 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:12 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:29:12 localhost systemd[1]: libpod-conmon-8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.scope: Deactivated successfully. Feb 20 04:29:12 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:29:12 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'. Feb 20 04:29:12 localhost python3.9[247433]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman Feb 20 04:29:12 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:29:12 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:13 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:13 localhost nova_compute[230552]: 2026-02-20 09:29:13.383 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:14 localhost python3.9[247557]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:29:14 localhost systemd[1]: Started libpod-conmon-f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.scope. 
Feb 20 04:29:14 localhost podman[247558]: 2026-02-20 09:29:14.21367895 +0000 UTC m=+0.105938782 container exec f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:29:14 localhost podman[247558]: 2026-02-20 09:29:14.246115618 +0000 UTC m=+0.138375470 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.296 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.298 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task 
ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.317 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.317 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.317 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.318 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:29:14 localhost 
nova_compute[230552]: 2026-02-20 09:29:14.318 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.689 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.691 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.692 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.692 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.732 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.733 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:14 localhost nova_compute[230552]: 2026-02-20 09:29:14.788 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.052 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.052 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.249 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.251 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12417MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.251 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.252 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.352 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.353 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.353 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.398 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.457 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Updating ProviderTree inventory for provider 
41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.458 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.485 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.505 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: 
HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 04:29:15 localhost nova_compute[230552]: 2026-02-20 09:29:15.537 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:29:15 localhost systemd[1]: 
var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:29:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54969 DF PROTO=TCP SPT=46126 DPT=9102 SEQ=3036961238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B83680000000001030307) Feb 20 04:29:15 localhost systemd[1]: var-lib-containers-storage-overlay-c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d-merged.mount: Deactivated successfully. Feb 20 04:29:16 localhost nova_compute[230552]: 2026-02-20 09:29:16.006 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:29:16 localhost nova_compute[230552]: 2026-02-20 09:29:16.018 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:29:16 localhost nova_compute[230552]: 2026-02-20 09:29:16.037 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} 
set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:29:16 localhost nova_compute[230552]: 2026-02-20 09:29:16.041 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:29:16 localhost nova_compute[230552]: 2026-02-20 09:29:16.042 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:29:16 localhost systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully. Feb 20 04:29:16 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. 
Feb 20 04:29:17 localhost python3.9[247741]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.043 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.044 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.044 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:29:17 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 20 04:29:17 localhost systemd[1]: libpod-conmon-f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.scope: Deactivated successfully. Feb 20 04:29:17 localhost systemd[1]: Started libpod-conmon-f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.scope. 
Feb 20 04:29:17 localhost podman[247742]: 2026-02-20 09:29:17.199503547 +0000 UTC m=+0.175480608 container exec f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:29:17 localhost podman[247742]: 2026-02-20 09:29:17.234232659 +0000 UTC m=+0.210209720 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.546 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.546 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.546 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing 
network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.546 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:29:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61962 DF PROTO=TCP SPT=43588 DPT=9882 SEQ=1917522212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B8A8C0000000001030307) Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.886 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.899 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.899 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.900 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.901 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.901 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:29:17 localhost nova_compute[230552]: 2026-02-20 09:29:17.902 230556 DEBUG nova.compute.manager [None 
req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.201 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.201 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.204 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': 'b70dd158-b587-4ee7-aedd-f97edd4d0186', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.201866', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a14ba9fc-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'eef811acd47218083a7e194ce90146e60a436ab76be28c660b992467c4a802a0'}]}, 'timestamp': '2026-02-20 09:29:18.204930', '_unique_id': '40bb2491480549a1bf25c7204e88ff24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.206 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '674d8021-d3e3-4d20-b717-afdfbddce903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.206400', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a14bedd6-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'c42052d66c0e21176f18e8bdfbf07084c3607104f8d0ed07ef5d221ce219e6dd'}]}, 'timestamp': '2026-02-20 09:29:18.206622', '_unique_id': '242b0afdf47d487c93279d9078a47e43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.231 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.232 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8d30e0e-18de-4447-9880-e9cfd38d1652', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.207604', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a14fdc0c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '8855a4e030ad1beea095494c845561e15e1d96f80f34f4d7082fe96e59e40f27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.207604', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a14fe5b2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'ef9e4f77b65d70161f7b02e606413ff98c419f233125ec616ebbd26b456a43d2'}]}, 'timestamp': '2026-02-20 09:29:18.232624', '_unique_id': '67053b086f2a4659a9b6848f9195119f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1ef7c7c-803c-4673-85cc-c93db4c487bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.234090', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a1518dea-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': 'ab47198e338af69c443f9a3f7eb22c8c116884d3580d4829d7f68cc3342c72a9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.234090', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a1519600-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': '6cd8ee387e76c928496c6568531450697a568e36d2e4c0a34077f68ca0ac9500'}]}, 'timestamp': '2026-02-20 09:29:18.243702', '_unique_id': '28df2373523344f6b1adfe30fcc8d99c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, 
in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR 
oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:29:18.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters 
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '615ef17a-d306-485d-a4ff-c12714c34758', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.244945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a151cf8a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '85f120fd046dc755bb35938d412273bb8b384fb439a7376b09b0f1b1dafd4123'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.244945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a151d746-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '6fc1e83de4112db3f490703e811af9bdda982c5fdfe746fac7a5053bd59bee11'}]}, 'timestamp': '2026-02-20 09:29:18.245349', '_unique_id': 'f96c720c4f8c4b0ab72f9c91f44e21bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.246 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8da5306b-de6b-470d-b9b2-bdd25e240dbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.246374', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a1520752-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'a28af7f59cef164d752a584ebd6f911a3e9ad7da0f9df21af28e8e0f9ef184fd'}]}, 'timestamp': '2026-02-20 09:29:18.246593', '_unique_id': '60866eb962a34589818b911c531fa471'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cf6ae61c-d580-4a33-bc60-a67c2dceb18d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.247543', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a1523650-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '5ca9d47052d7d95316aad629cd2a3c376f37b36672a756e9aaa466f1729fef2e'}]}, 'timestamp': '2026-02-20 09:29:18.247797', '_unique_id': '39b2b0a092a84938b0b8f5147fccf8de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c855185-63fd-4ad0-a6c7-80659a2b4b4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.248778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a15264ea-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '4afe277bdebb90ac7c9f2f32be4366a3df35dc0d271fdeed09a17cedc2abd754'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.248778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a1526c1a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'f9f59808ca26138d786844e73a2a985e72e04f90815523a18b0386858743e727'}]}, 'timestamp': '2026-02-20 09:29:18.249158', '_unique_id': 'f6dc3c70e7914b19a4a43ec31f0a51a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:29:18.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f951ad5c-8f33-42e7-837e-48cc474bacaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.251072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a152be9a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '3233aeea1d611f4cf349c687aa414a7e8731227076f38c51094bbb6817550aa4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.251072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a152c61a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'fe290eae114bde1d46eb35cf47c622dfb9ab8c5a92a88f015803545fcea5b1d9'}]}, 'timestamp': '2026-02-20 09:29:18.251463', '_unique_id': '0bf6f7e4f5f64801893d8d1cbedc160b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.252 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.252 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9be8c296-d9a8-4d1c-b690-385467e6d8a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.252472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a152f536-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': '9b7e1a6431022c0fe195139ba46627877232b723a42e124c238fd38ee62a25d3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': 
'2026-02-20T09:29:18.252472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a152fd38-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': '413c7fbd7a984336ce1376aa6cda2a657c30ee18eba7218d2e8c1f010c4e3b1d'}]}, 'timestamp': '2026-02-20 09:29:18.252874', '_unique_id': 'ff2e3887d6b24eb0a2bbe160306f6330'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1684e9f9-3910-44dd-86c5-37d62d7c8c0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.254705', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a1534cac-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '2f0e4d04ad3ade9fc17264c0aefc934c403e63283da3ae6e7adfbdabe166d5fe'}]}, 'timestamp': '2026-02-20 09:29:18.254922', '_unique_id': 'c9ac08111c6c4d3ba21f59e74efa9cba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72081d2c-cda2-4bd1-9b51-b44fc7d0ed04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.255879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a1537a42-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '65222d989d5213a39a7b80b320a78a4c4dc36816753ab6231772758eaedbcbb9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.255879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a153821c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'add27f31c7f120bec4a0e199292e0fe7644237d56a6064e39c8323c9b80f15af'}]}, 'timestamp': '2026-02-20 09:29:18.256275', '_unique_id': '90ff62f8f23e480c83be3c47e5531b9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12
ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4eaf06c-f0c5-45f8-a82d-f60d144a3652', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.257241', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a153af94-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '6dfef66d421a4006d909b765b0f94a6bba0c35d8b99a2e4b400e214c8755260a'}]}, 'timestamp': '2026-02-20 09:29:18.257453', '_unique_id': 'b6e989bf6a1941f4b990053670766a3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.258 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.258 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '1c9d7012-d1fb-415d-a0a0-a0a9983119fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.258405', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a153dce4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '4beb8830f1bd77919ad9e07c5c468b45128f0a858a327f8eefa7902756c52d9d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.258405', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a153e4be-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'fe22c0b7ba3d8458427e09586ffd1b097659f01c043a41668fc05e874a80562d'}]}, 'timestamp': '2026-02-20 09:29:18.258799', '_unique_id': '19b39702213e4782bcf9fbfb59ea9581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '68ddd221-8431-4be9-a38a-959d0b097acd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.259756', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a15411be-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'bec66284c9054bd7c50b45e959974de50e9c4ef228f5309444601af5a7dcbe2c'}]}, 'timestamp': '2026-02-20 09:29:18.259965', '_unique_id': 'e9e8e1d189eb42219a657ef8a22b22c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11a4730f-261a-4dff-aab5-e3091e1eabcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.260943', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a1544026-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'd23d940dcdf4ffec28db47ed41f40dabc7fe870d3a3af500715c955a3b85cdb1'}]}, 'timestamp': '2026-02-20 09:29:18.261152', '_unique_id': '74f06d3c01394abe83699abaaaa6421f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.262 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 59050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'faf94a01-bbf0-4ccd-87c2-927dcc0d1a2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59050000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:29:18.262088', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a156ef10-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.517559022, 'message_signature': 'abb1ce2d2d501ea1df316cea1df7259bbc7a69942221e354484134e991fa06a7'}]}, 'timestamp': '2026-02-20 09:29:18.278752', '_unique_id': 'e96de3ff7b9445ab82c209d8640f34d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8a73df8-0963-499b-9227-320ee3ab0ea5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.279702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a1571cd8-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': '75b999193a6df88a3e01c71d52b147f35f322eda619fdd93072f139175902a06'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.279702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a15723ea-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': 'd825d57d3f112e4847d83af33c9a98bbc2afd5c430c0dcbce4577295c88adf45'}]}, 'timestamp': '2026-02-20 09:29:18.280091', '_unique_id': 'afb6d32d2ba245fc99f305c3a48e3ca7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83d1ef13-6899-4a0f-84d1-4a2ed95ae28d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:29:18.281052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a1575194-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.517559022, 'message_signature': 'fb7bf8fc523c1ba8c45551b98ede7d80677cc3edd7fcf81c72814b8df02dda17'}]}, 'timestamp': '2026-02-20 09:29:18.281253', '_unique_id': '27bc6bfdc6fc4723928de10410f46e13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]:
2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'db7c1574-441e-4f68-9bcc-8d4388649fbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.282248', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a157804c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '4e10ed92719cb2163280ea8955b00e90f9faaf7ab96a204be125121ad41e74d3'}]}, 'timestamp': '2026-02-20 09:29:18.282455', '_unique_id': 'f16a6ebc6f434acdbc9a4463c003a6dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:29:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:29:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.283 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 8991 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd46b0e91-30b9-458a-94e0-c58d06bed979', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8991, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.283395', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a157ad24-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '64eedd6f68a274c2a36b8b6ad5b5e8b287eacded519fd9d80b4951652cd329f2'}]}, 'timestamp': '2026-02-20 09:29:18.283605', '_unique_id': '4eb41026091a4cc4b09e20c31f367ea6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:29:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:29:18 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 04:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:29:18 localhost python3.9[247892]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:29:19 localhost python3.9[248002]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 20 04:29:19 localhost nova_compute[230552]: 2026-02-20 09:29:19.734 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:29:19 localhost nova_compute[230552]: 2026-02-20 09:29:19.737 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:29:19 localhost nova_compute[230552]: 2026-02-20 09:29:19.737 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:29:19 localhost nova_compute[230552]: 2026-02-20 09:29:19.737 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:29:19 localhost nova_compute[230552]: 2026-02-20 09:29:19.764 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:29:19 localhost nova_compute[230552]: 2026-02-20 09:29:19.765 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:29:20 localhost sshd[248015]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:29:20 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 04:29:20 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 04:29:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61964 DF PROTO=TCP SPT=43588 DPT=9882 SEQ=1917522212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B96A90000000001030307)
Feb 20 04:29:20 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 04:29:20 localhost systemd[1]: libpod-conmon-f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.scope: Deactivated successfully.
Feb 20 04:29:20 localhost podman[247772]: 2026-02-20 09:29:20.872846031 +0000 UTC m=+2.546478101 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:29:20 localhost podman[247772]: 2026-02-20 09:29:20.932170178 +0000 UTC m=+2.605802278 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:29:22 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:22 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:22 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:22 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:29:23 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:29:23 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:23 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:24 localhost podman[248028]: 2026-02-20 09:29:24.029788033 +0000 UTC m=+0.256483102 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.7, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Feb 20 04:29:24 localhost podman[248028]: 2026-02-20 09:29:24.049127797 +0000 UTC m=+0.275822826 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, distribution-scope=public, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter) Feb 20 04:29:24 localhost python3.9[248156]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:29:24 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:29:24 localhost systemd[1]: Started libpod-conmon-010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.scope. Feb 20 04:29:24 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:29:24 localhost nova_compute[230552]: 2026-02-20 09:29:24.765 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:24 localhost podman[248157]: 2026-02-20 09:29:24.770919613 +0000 UTC m=+0.136436477 container exec 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:29:24 localhost podman[248157]: 2026-02-20 09:29:24.803148803 +0000 UTC m=+0.168665667 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 
'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:29:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61965 DF PROTO=TCP SPT=43588 DPT=9882 SEQ=1917522212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BA6680000000001030307) Feb 20 04:29:24 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:29:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:29:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:29:25 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:29:26 localhost python3.9[248320]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:29:27 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:29:27 localhost systemd[1]: var-lib-containers-storage-overlay-a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95-merged.mount: Deactivated successfully. 
Feb 20 04:29:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58311 DF PROTO=TCP SPT=48800 DPT=9105 SEQ=3761662783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BB1280000000001030307) Feb 20 04:29:27 localhost systemd[1]: libpod-conmon-010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.scope: Deactivated successfully. Feb 20 04:29:27 localhost podman[248187]: 2026-02-20 09:29:27.74003001 +0000 UTC m=+2.184054450 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 20 04:29:27 localhost podman[248188]: 2026-02-20 09:29:27.793076533 +0000 UTC m=+2.233197208 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:29:27 localhost podman[248188]: 2026-02-20 09:29:27.801942839 +0000 UTC m=+2.242063554 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent) Feb 20 04:29:27 localhost systemd[1]: Started libpod-conmon-010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.scope. Feb 20 04:29:27 localhost podman[248187]: 2026-02-20 09:29:27.90820336 +0000 UTC m=+2.352227820 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:29:27 localhost podman[248321]: 2026-02-20 09:29:27.917439458 +0000 UTC m=+1.649754259 container exec 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:29:27 localhost podman[248321]: 2026-02-20 09:29:27.951207068 +0000 UTC m=+1.683521879 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:29:28 localhost systemd[1]: 
var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 04:29:28 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 04:29:28 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:29:28 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:29:28 localhost systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 04:29:28 localhost systemd[1]: libpod-conmon-010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.scope: Deactivated successfully.
Feb 20 04:29:29 localhost python3.9[248480]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:29:29 localhost systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 04:29:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58312 DF PROTO=TCP SPT=48800 DPT=9105 SEQ=3761662783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BB9280000000001030307)
Feb 20 04:29:29 localhost nova_compute[230552]: 2026-02-20 09:29:29.768 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:29:29 localhost nova_compute[230552]: 2026-02-20 09:29:29.770 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:29:29 localhost nova_compute[230552]: 2026-02-20 09:29:29.770 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:29:29 localhost nova_compute[230552]: 2026-02-20 09:29:29.770 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:29:29 localhost nova_compute[230552]: 2026-02-20 09:29:29.772 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:29:29 localhost nova_compute[230552]: 2026-02-20 09:29:29.772 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:29:29 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 04:29:30 localhost python3.9[248590]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 20 04:29:31 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 04:29:31 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 20 04:29:31 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 20 04:29:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26776 DF PROTO=TCP SPT=40870 DPT=9100 SEQ=2551385514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BC5680000000001030307)
Feb 20 04:29:32 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 04:29:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:29:33 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 04:29:33 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 04:29:33 localhost podman[248604]: 2026-02-20 09:29:33.181234995 +0000 UTC m=+0.184564900 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127) Feb 20 04:29:33 localhost podman[248604]: 2026-02-20 09:29:33.196034273 +0000 UTC m=+0.199364238 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:29:33 localhost python3.9[248730]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:34 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:29:34 localhost systemd[1]: Started libpod-conmon-7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.scope. Feb 20 04:29:34 localhost podman[248731]: 2026-02-20 09:29:34.400225984 +0000 UTC m=+0.540699629 container exec 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, name=ubi9/ubi-minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest release of the minimal 
Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z) Feb 20 04:29:34 localhost podman[248731]: 2026-02-20 09:29:34.435148012 +0000 UTC m=+0.575621617 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, version=9.7, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, distribution-scope=public, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z) Feb 20 04:29:34 localhost nova_compute[230552]: 2026-02-20 09:29:34.773 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:34 localhost nova_compute[230552]: 2026-02-20 09:29:34.775 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:34 localhost nova_compute[230552]: 2026-02-20 09:29:34.775 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:29:34 localhost nova_compute[230552]: 2026-02-20 09:29:34.776 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:34 localhost nova_compute[230552]: 2026-02-20 09:29:34.812 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:34 localhost nova_compute[230552]: 2026-02-20 09:29:34.813 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:35 localhost systemd[1]: libpod-conmon-7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.scope: Deactivated successfully. Feb 20 04:29:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10338 DF PROTO=TCP SPT=39658 DPT=9101 SEQ=3112116702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BD3350000000001030307) Feb 20 04:29:36 localhost python3.9[248869]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 20 04:29:36 localhost systemd[1]: Started libpod-conmon-7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.scope. 
Feb 20 04:29:36 localhost podman[248870]: 2026-02-20 09:29:36.479912003 +0000 UTC m=+0.106316603 container exec 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., release=1770267347, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible) Feb 20 04:29:36 localhost podman[248870]: 2026-02-20 09:29:36.508951061 +0000 UTC m=+0.135355621 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, build-date=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal 
Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vcs-type=git, maintainer=Red Hat, Inc.) Feb 20 04:29:36 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 20 04:29:36 localhost systemd[1]: var-lib-containers-storage-overlay-b52b75a7380249fd6beb40dca6e23a5c2c2b3650de6523e005db6f52b5fe90d0-merged.mount: Deactivated successfully. Feb 20 04:29:37 localhost python3.9[249008]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:38 localhost python3.9[249118]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:39 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:39 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 20 04:29:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10340 DF PROTO=TCP SPT=39658 DPT=9101 SEQ=3112116702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BDF280000000001030307) Feb 20 04:29:39 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:29:39 localhost systemd[1]: libpod-conmon-7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.scope: Deactivated successfully. Feb 20 04:29:39 localhost python3.9[249228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:39 localhost nova_compute[230552]: 2026-02-20 09:29:39.814 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:39 localhost nova_compute[230552]: 2026-02-20 09:29:39.817 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:39 localhost nova_compute[230552]: 2026-02-20 09:29:39.817 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:29:39 localhost nova_compute[230552]: 2026-02-20 09:29:39.817 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:39 localhost nova_compute[230552]: 2026-02-20 09:29:39.856 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:39 localhost nova_compute[230552]: 2026-02-20 09:29:39.856 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:40 localhost python3.9[249316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579779.3375716-3717-12228653072524/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:41 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:41 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:41 localhost python3.9[249426]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:41 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. 
Feb 20 04:29:41 localhost python3.9[249536]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58314 DF PROTO=TCP SPT=48800 DPT=9105 SEQ=3761662783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BE9680000000001030307) Feb 20 04:29:42 localhost python3.9[249593]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:42 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:29:42 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:29:42 localhost podman[249611]: 2026-02-20 09:29:42.683214095 +0000 UTC m=+0.095226345 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:29:42 localhost podman[249611]: 2026-02-20 09:29:42.691630187 +0000 UTC m=+0.103642397 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:29:42 localhost podman[249611]: unhealthy Feb 20 04:29:42 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:29:42 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'. Feb 20 04:29:43 localhost python3.9[249725]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:43 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:43 localhost python3.9[249782]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.5fp4eekw recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:43 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:29:43 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:29:44 localhost python3.9[249892]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:44 localhost python3.9[249949]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:44 localhost nova_compute[230552]: 2026-02-20 09:29:44.857 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:45 localhost python3.9[250059]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:29:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6765 DF PROTO=TCP SPT=37224 DPT=9102 SEQ=3285053212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BF9680000000001030307) Feb 20 04:29:46 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 20 04:29:46 localhost python3[250170]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 20 04:29:46 localhost systemd[1]: var-lib-containers-storage-overlay-81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24-merged.mount: Deactivated successfully. Feb 20 04:29:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:29:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6600.1 total, 600.0 interval
Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:29:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21189 DF PROTO=TCP SPT=58658 DPT=9882 SEQ=85989402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BFFBA0000000001030307) Feb 20 04:29:47 localhost python3.9[250280]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:48 localhost python3.9[250337]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S
unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:48 localhost python3.9[250447]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:48 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:49 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:29:49 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:29:49 localhost python3.9[250504]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:49 localhost nova_compute[230552]: 2026-02-20 09:29:49.860 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:49 localhost nova_compute[230552]: 2026-02-20 09:29:49.862 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:49 localhost nova_compute[230552]: 2026-02-20 09:29:49.863 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:29:49 localhost nova_compute[230552]: 2026-02-20 09:29:49.863 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:49 localhost nova_compute[230552]: 2026-02-20 09:29:49.898 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:49 localhost nova_compute[230552]: 2026-02-20 09:29:49.898 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:50 localhost python3.9[250614]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21191 DF PROTO=TCP SPT=58658 DPT=9882 SEQ=85989402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C0BA80000000001030307) Feb 20 04:29:50 localhost python3.9[250671]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:51 localhost systemd[1]: 
var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:51 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:29:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6600.1 total, 600.0 interval
Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:29:51 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 04:29:51 localhost python3.9[250781]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:52 localhost python3.9[250838]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:52 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:29:52 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:52 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:29:52 localhost python3.9[250948]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:29:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:29:53 localhost podman[250967]: 2026-02-20 09:29:53.156927331 +0000 UTC m=+0.088825129 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:29:53 localhost podman[250967]: 2026-02-20 09:29:53.163623047 +0000 UTC m=+0.095520875 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:29:53 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:29:53 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:29:53 localhost python3.9[251061]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579792.422267-4092-214355718096356/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:53 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:29:54 localhost python3.9[251171]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21192 DF PROTO=TCP SPT=58658 DPT=9882 SEQ=85989402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C1B680000000001030307) Feb 20 04:29:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:29:54 localhost nova_compute[230552]: 2026-02-20 09:29:54.899 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:54 localhost nova_compute[230552]: 2026-02-20 09:29:54.901 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:54 localhost nova_compute[230552]: 2026-02-20 09:29:54.901 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:29:54 localhost nova_compute[230552]: 2026-02-20 09:29:54.901 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:54 localhost nova_compute[230552]: 2026-02-20 09:29:54.925 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:54 localhost nova_compute[230552]: 2026-02-20 09:29:54.925 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:54 localhost podman[251282]: 2026-02-20 09:29:54.9263047 +0000 UTC m=+0.119148847 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, 
version=9.7, distribution-scope=public, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z) Feb 20 04:29:54 localhost podman[251282]: 2026-02-20 09:29:54.939992902 +0000 UTC m=+0.132837089 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.7, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:29:55 localhost python3.9[251281]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:29:55 localhost python3.9[251413]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:56 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:29:56 localhost systemd[1]: var-lib-containers-storage-overlay-fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe-merged.mount: Deactivated successfully. Feb 20 04:29:56 localhost systemd[1]: var-lib-containers-storage-overlay-fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe-merged.mount: Deactivated successfully. Feb 20 04:29:56 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 04:29:56 localhost python3.9[251523]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:29:57 localhost python3.9[251634]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:29:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46172 DF PROTO=TCP SPT=60534 DPT=9105 SEQ=285388273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C26680000000001030307) Feb 20 04:29:58 localhost python3.9[251746]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:29:58 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:29:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:29:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:29:58 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 20 04:29:58 localhost podman[251750]: 2026-02-20 09:29:58.915674169 +0000 UTC m=+0.089290174 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:29:58 localhost podman[251751]: 2026-02-20 09:29:58.990579238 +0000 UTC m=+0.160040099 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent) Feb 20 04:29:59 localhost podman[251750]: 2026-02-20 09:29:59.01914226 +0000 UTC m=+0.192758315 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true) Feb 20 04:29:59 localhost podman[251751]: 2026-02-20 09:29:59.075781518 +0000 UTC m=+0.245242349 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:29:59 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:29:59 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:29:59 localhost python3.9[251904]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:29:59 localhost openstack_network_exporter[244414]: ERROR 09:29:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:29:59 localhost openstack_network_exporter[244414]: Feb 20 04:29:59 localhost openstack_network_exporter[244414]: ERROR 09:29:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:29:59 localhost openstack_network_exporter[244414]: Feb 20 04:29:59 localhost nova_compute[230552]: 2026-02-20 09:29:59.926 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:29:59 localhost nova_compute[230552]: 2026-02-20 09:29:59.927 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:29:59 localhost nova_compute[230552]: 2026-02-20 09:29:59.928 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:29:59 localhost nova_compute[230552]: 2026-02-20 09:29:59.928 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:59 localhost nova_compute[230552]: 2026-02-20 09:29:59.929 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:29:59 localhost nova_compute[230552]: 2026-02-20 09:29:59.933 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:00 localhost systemd[1]: session-57.scope: Deactivated successfully. Feb 20 04:30:00 localhost systemd[1]: session-57.scope: Consumed 27.754s CPU time. Feb 20 04:30:00 localhost systemd-logind[759]: Session 57 logged out. Waiting for processes to exit. Feb 20 04:30:00 localhost systemd-logind[759]: Removed session 57. Feb 20 04:30:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48581 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C32440000000001030307) Feb 20 04:30:00 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:30:00 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:30:01 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 20 04:30:01 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:30:02 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 20 04:30:02 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:30:02 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:30:03 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:30:03 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:30:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48583 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C3E690000000001030307) Feb 20 04:30:04 localhost nova_compute[230552]: 2026-02-20 09:30:04.929 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:30:05 localhost podman[251925]: 2026-02-20 09:30:05.146894187 +0000 UTC m=+0.087387039 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, 
config_id=ceilometer_agent_compute) Feb 20 04:30:05 localhost podman[251925]: 2026-02-20 09:30:05.193035662 +0000 UTC m=+0.133528454 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
config_id=ceilometer_agent_compute) Feb 20 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 20 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207-merged.mount: Deactivated successfully. Feb 20 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207-merged.mount: Deactivated successfully. Feb 20 04:30:05 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:30:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:30:05.992 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:30:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:30:05.992 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:30:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:30:05.994 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:30:06 localhost sshd[251980]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:30:06 localhost systemd-logind[759]: New session 58 of user zuul. Feb 20 04:30:06 localhost systemd[1]: Started Session 58 of User zuul. 
Feb 20 04:30:06 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:30:06 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully. Feb 20 04:30:06 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully. Feb 20 04:30:07 localhost python3.9[252126]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config/container-startup-config/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:07 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:30:07 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 20 04:30:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48584 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C4E280000000001030307) Feb 20 04:30:07 localhost python3.9[252236]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:08 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:30:08 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully. Feb 20 04:30:08 localhost systemd[1]: var-lib-containers-storage-overlay-49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c-merged.mount: Deactivated successfully. 
Feb 20 04:30:08 localhost python3.9[252346]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:09 localhost python3.9[252472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:09 localhost nova_compute[230552]: 2026-02-20 09:30:09.932 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:10 localhost systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully. Feb 20 04:30:10 localhost systemd[1]: var-lib-containers-storage-overlay-3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976-merged.mount: Deactivated successfully. Feb 20 04:30:10 localhost systemd[1]: var-lib-containers-storage-overlay-3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976-merged.mount: Deactivated successfully. 
Feb 20 04:30:10 localhost python3.9[252558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579809.2405798-101-217869708377355/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:11 localhost python3.9[252666]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:11 localhost python3.9[252752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579810.4102385-101-236777048400658/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:12 localhost systemd[1]: var-lib-containers-storage-overlay-33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe-merged.mount: Deactivated successfully. Feb 20 04:30:12 localhost systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully. Feb 20 04:30:12 localhost systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully. 
Feb 20 04:30:12 localhost python3.9[252860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:30:12 localhost python3.9[252946]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579811.9420767-101-24524186812911/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=ef2f7fbed7b4b53fbfecfbf9796227b8acb52519 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:12 localhost podman[252947]: 2026-02-20 09:30:12.89567664 +0000 UTC m=+0.080044343 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:30:12 localhost podman[252947]: 2026-02-20 09:30:12.930135054 +0000 UTC m=+0.114502717 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:30:12 localhost podman[252947]: unhealthy Feb 20 04:30:13 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. Feb 20 04:30:13 localhost systemd[1]: var-lib-containers-storage-overlay-33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe-merged.mount: Deactivated successfully. Feb 20 04:30:13 localhost systemd[1]: var-lib-containers-storage-overlay-33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe-merged.mount: Deactivated successfully. 
Feb 20 04:30:13 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE Feb 20 04:30:13 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'. Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.301 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.301 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.302 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:14 localhost python3.9[253077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:14 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:30:14 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. 
Feb 20 04:30:14 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully. Feb 20 04:30:14 localhost python3.9[253163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579813.9847262-276-149246084429781/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=13d630d090b626c2aab1085bca0daa7abb0cabfd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.936 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.938 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.938 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.938 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.995 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:14 localhost nova_compute[230552]: 2026-02-20 09:30:14.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:15 localhost nova_compute[230552]: 2026-02-20 09:30:15.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:15 localhost nova_compute[230552]: 2026-02-20 09:30:15.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:15 localhost systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully. Feb 20 04:30:15 localhost nova_compute[230552]: 2026-02-20 09:30:15.320 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:30:15 localhost nova_compute[230552]: 2026-02-20 09:30:15.320 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:30:15 localhost nova_compute[230552]: 2026-02-20 09:30:15.321 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:30:15 localhost nova_compute[230552]: 2026-02-20 09:30:15.321 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:30:15 localhost nova_compute[230552]: 2026-02-20 09:30:15.321 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:30:15 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. Feb 20 04:30:15 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully. 
Feb 20 04:30:15 localhost python3.9[253272]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:30:15 localhost nova_compute[230552]: 2026-02-20 09:30:15.796 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.119 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.119 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:30:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48585 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C6F690000000001030307) Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.317 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.318 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12445MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.318 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.319 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:30:16 localhost python3.9[253405]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.493 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.493 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.493 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:30:16 localhost nova_compute[230552]: 2026-02-20 09:30:16.543 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:30:17 localhost nova_compute[230552]: 2026-02-20 09:30:16.998 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:30:17 localhost nova_compute[230552]: 2026-02-20 09:30:17.006 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:30:17 localhost nova_compute[230552]: 
2026-02-20 09:30:17.026 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:30:17 localhost nova_compute[230552]: 2026-02-20 09:30:17.029 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:30:17 localhost nova_compute[230552]: 2026-02-20 09:30:17.029 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:30:17 localhost python3.9[253535]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:17 localhost systemd[1]: var-lib-containers-storage-overlay-3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976-merged.mount: Deactivated successfully. 
Feb 20 04:30:17 localhost python3.9[253594]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.026 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.026 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.026 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.026 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.178 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock 
"refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.178 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.178 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.178 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:30:18 localhost python3.9[253704]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.620 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.636 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.636 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.637 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.637 230556 DEBUG oslo_service.periodic_task [None 
req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:18 localhost nova_compute[230552]: 2026-02-20 09:30:18.637 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:30:19 localhost python3.9[253761]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:19 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:30:19 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. 
Feb 20 04:30:19 localhost python3.9[253871]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:19 localhost nova_compute[230552]: 2026-02-20 09:30:19.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:20 localhost nova_compute[230552]: 2026-02-20 09:30:19.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:20 localhost nova_compute[230552]: 2026-02-20 09:30:19.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:30:20 localhost nova_compute[230552]: 2026-02-20 09:30:19.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:20 localhost nova_compute[230552]: 2026-02-20 09:30:19.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:20 localhost nova_compute[230552]: 2026-02-20 09:30:20.000 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:20 localhost nova_compute[230552]: 2026-02-20 09:30:20.906 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task 
ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:30:21 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:30:21 localhost python3.9[253981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:21 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 20 04:30:21 localhost python3.9[254038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:22 localhost python3.9[254148]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:22 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 20 04:30:22 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 20 04:30:22 localhost python3.9[254205]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:23 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:30:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:30:23 localhost systemd[1]: tmp-crun.u5XTcn.mount: Deactivated successfully. 
Feb 20 04:30:23 localhost podman[254315]: 2026-02-20 09:30:23.742529746 +0000 UTC m=+0.103669472 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:30:23 localhost podman[254315]: 2026-02-20 09:30:23.752147712 +0000 UTC m=+0.113287448 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:30:23 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:30:23 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:30:23 localhost python3.9[254316]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:30:23 localhost systemd[1]: Reloading. Feb 20 04:30:24 localhost systemd-rc-local-generator[254358]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:30:24 localhost systemd-sysv-generator[254361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:24 localhost python3.9[254485]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:25 localhost nova_compute[230552]: 2026-02-20 09:30:25.000 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:25 localhost 
nova_compute[230552]: 2026-02-20 09:30:25.003 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:25 localhost nova_compute[230552]: 2026-02-20 09:30:25.003 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:30:25 localhost nova_compute[230552]: 2026-02-20 09:30:25.003 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:25 localhost nova_compute[230552]: 2026-02-20 09:30:25.054 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:25 localhost nova_compute[230552]: 2026-02-20 09:30:25.054 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:25 localhost python3.9[254542]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:25 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. Feb 20 04:30:25 localhost systemd[1]: var-lib-containers-storage-overlay-2e918dbe2b7e6f336ecb4cc5413e464b0e0467f389d3daf96290bbb17e0d3afb-merged.mount: Deactivated successfully. 
Feb 20 04:30:26 localhost python3.9[254652]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:26 localhost openstack_network_exporter[244414]: ERROR 09:30:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:30:26 localhost openstack_network_exporter[244414]: Feb 20 04:30:26 localhost openstack_network_exporter[244414]: ERROR 09:30:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:30:26 localhost openstack_network_exporter[244414]: Feb 20 04:30:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:30:26 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 20 04:30:26 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. 
Feb 20 04:30:26 localhost podman[254712]: 2026-02-20 09:30:26.775388504 +0000 UTC m=+0.087519593 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, 
managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, vcs-type=git, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9) Feb 20 04:30:26 localhost podman[254712]: 2026-02-20 09:30:26.793201845 +0000 UTC m=+0.105332924 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, release=1770267347, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container) Feb 20 04:30:26 localhost systemd[1]: 
var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. Feb 20 04:30:26 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:30:26 localhost python3.9[254711]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:27 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:30:27 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 20 04:30:28 localhost python3.9[254841]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:30:28 localhost systemd[1]: Reloading. Feb 20 04:30:28 localhost systemd-sysv-generator[254871]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:30:28 localhost systemd-rc-local-generator[254866]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:28 localhost systemd[1]: Starting Create netns directory... Feb 20 04:30:28 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 20 04:30:28 localhost systemd[1]: Finished Create netns directory. Feb 20 04:30:28 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. Feb 20 04:30:28 localhost systemd[1]: var-lib-containers-storage-overlay-748996d00ab757a5bda247e45e6a81f3904e24554510d07cc1e7533917ef279a-merged.mount: Deactivated successfully. 
Feb 20 04:30:28 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully. Feb 20 04:30:28 localhost systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully. Feb 20 04:30:28 localhost systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully. Feb 20 04:30:29 localhost podman[241968]: @ - - [20/Feb/2026:09:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 143381 "" "Go-http-client/1.1" Feb 20 04:30:29 localhost podman_exporter[241957]: ts=2026-02-20T09:30:29.049Z caller=exporter.go:96 level=info msg="Listening on" address=:9882 Feb 20 04:30:29 localhost podman_exporter[241957]: ts=2026-02-20T09:30:29.050Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882 Feb 20 04:30:29 localhost podman_exporter[241957]: ts=2026-02-20T09:30:29.050Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882 Feb 20 04:30:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:30:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:30:29 localhost podman[254996]: 2026-02-20 09:30:29.434858455 +0000 UTC m=+0.099732250 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:30:29 localhost podman[254996]: 2026-02-20 09:30:29.47711465 +0000 UTC m=+0.141988465 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2) Feb 20 04:30:29 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:30:29 localhost podman[254997]: 2026-02-20 09:30:29.485652024 +0000 UTC m=+0.151064806 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:30:29 localhost 
python3.9[254998]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:29 localhost podman[254997]: 2026-02-20 09:30:29.566030165 +0000 UTC m=+0.231442947 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:30:29 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:30:30 localhost nova_compute[230552]: 2026-02-20 09:30:30.053 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24047 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CA7740000000001030307) Feb 20 04:30:30 localhost python3.9[255148]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:30:31 localhost python3.9[255258]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24048 DF PROTO=TCP 
SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CAB680000000001030307) Feb 20 04:30:32 localhost python3.9[255346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579830.9900656-710-1157605902233/.source.json _original_basename=.mtvfow7d follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48586 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CAF680000000001030307) Feb 20 04:30:33 localhost python3.9[255454]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24049 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CB3680000000001030307) Feb 20 04:30:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6767 DF PROTO=TCP SPT=37224 DPT=9102 SEQ=3285053212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CB7690000000001030307) Feb 20 04:30:35 localhost nova_compute[230552]: 2026-02-20 09:30:35.057 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:35 localhost nova_compute[230552]: 2026-02-20 09:30:35.059 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:35 localhost nova_compute[230552]: 2026-02-20 09:30:35.060 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:30:35 localhost nova_compute[230552]: 2026-02-20 09:30:35.060 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:35 localhost nova_compute[230552]: 2026-02-20 09:30:35.081 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:35 localhost nova_compute[230552]: 2026-02-20 09:30:35.082 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:35 localhost python3.9[255758]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False Feb 20 04:30:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:30:36 localhost podman[255776]: 2026-02-20 09:30:36.153609995 +0000 UTC m=+0.088254946 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:30:36 localhost podman[255776]: 2026-02-20 09:30:36.193223458 +0000 UTC m=+0.127868369 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute) Feb 20 04:30:36 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:30:36 localhost python3.9[255888]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:30:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24050 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CC3280000000001030307) Feb 20 04:30:37 localhost python3[255998]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json containers=['neutron_sriov_agent'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:30:38 localhost podman[256035]: Feb 20 04:30:38 localhost podman[256035]: 2026-02-20 09:30:38.220199492 +0000 UTC m=+0.118180961 container create ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:30:38 localhost podman[256035]: 2026-02-20 09:30:38.140205921 +0000 UTC m=+0.038187430 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Feb 20 04:30:38 localhost python3[255998]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Feb 20 04:30:40 localhost nova_compute[230552]: 2026-02-20 09:30:40.083 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:40 localhost nova_compute[230552]: 2026-02-20 09:30:40.086 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:40 localhost nova_compute[230552]: 2026-02-20 09:30:40.086 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:30:40 localhost nova_compute[230552]: 2026-02-20 09:30:40.086 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:40 localhost nova_compute[230552]: 2026-02-20 09:30:40.132 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:40 localhost nova_compute[230552]: 2026-02-20 09:30:40.132 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:40 localhost sshd[256091]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:30:42 localhost python3.9[256185]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:30:43 localhost python3.9[256297]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:44 localhost python3.9[256352]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:30:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:30:44 localhost podman[256359]: 2026-02-20 09:30:44.134057161 +0000 UTC m=+0.069826777 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:30:44 localhost podman[256359]: 2026-02-20 09:30:44.175122318 +0000 UTC m=+0.110891914 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:30:44 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:30:44 localhost python3.9[256485]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579844.0837593-944-173267184265401/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:45 localhost nova_compute[230552]: 2026-02-20 09:30:45.133 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:45 localhost nova_compute[230552]: 2026-02-20 09:30:45.134 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:30:45 localhost nova_compute[230552]: 2026-02-20 09:30:45.135 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:30:45 localhost nova_compute[230552]: 2026-02-20 09:30:45.135 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:45 localhost nova_compute[230552]: 2026-02-20 09:30:45.136 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:30:45 localhost nova_compute[230552]: 2026-02-20 09:30:45.138 230556 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:45 localhost python3.9[256540]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:30:45 localhost systemd[1]: Reloading. Feb 20 04:30:45 localhost systemd-rc-local-generator[256565]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:30:45 localhost systemd-sysv-generator[256569]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24051 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CE3690000000001030307) Feb 20 04:30:46 localhost python3.9[256631]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:30:46 localhost systemd[1]: Reloading. Feb 20 04:30:46 localhost systemd-sysv-generator[256664]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:30:46 localhost systemd-rc-local-generator[256658]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:30:46 localhost systemd[1]: Starting neutron_sriov_agent container... Feb 20 04:30:46 localhost systemd[1]: Started libcrun container. 
Feb 20 04:30:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 20 04:30:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:30:46 localhost podman[256672]: 2026-02-20 09:30:46.741613618 +0000 UTC m=+0.112529305 container init ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent) Feb 20 04:30:46 localhost 
podman[256672]: 2026-02-20 09:30:46.750601457 +0000 UTC m=+0.121517134 container start ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:30:46 localhost podman[256672]: neutron_sriov_agent Feb 20 04:30:46 localhost systemd[1]: Started neutron_sriov_agent container. 
Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + sudo -E kolla_set_configs Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Validating config file Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Copying service configuration files Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Writing out command to execute Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Feb 20 04:30:46 
localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: ++ cat /run_command Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + CMD=/usr/bin/neutron-sriov-nic-agent Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + ARGS= Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + sudo kolla_copy_cacerts Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + [[ ! -n '' ]] Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + . 
kolla_extend_start Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: Running command: '/usr/bin/neutron-sriov-nic-agent' Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\''' Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + umask 0022 Feb 20 04:30:46 localhost neutron_sriov_agent[256688]: + exec /usr/bin/neutron-sriov-nic-agent Feb 20 04:30:47 localhost podman[241968]: time="2026-02-20T09:30:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:30:47 localhost podman[241968]: @ - - [20/Feb/2026:09:30:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147334 "" "Go-http-client/1.1" Feb 20 04:30:47 localhost podman[241968]: @ - - [20/Feb/2026:09:30:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16341 "" "Go-http-client/1.1" Feb 20 04:30:48 localhost python3.9[256811]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider 
bandwidths: {}#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.417 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.417 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005625204.localdomain'}#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.417 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] RPC agent_id: nic-switch-agent.np0005625204.localdomain#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.422 2 INFO neutron.agent.agent_extensions_manager [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] Loaded agent extensions: ['qos']#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.422 2 INFO neutron.agent.agent_extensions_manager [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] Initializing agent extension 'qos'#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.811 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] Agent initialized successfully, now running... 
#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.812 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m Feb 20 04:30:48 localhost neutron_sriov_agent[256688]: 2026-02-20 09:30:48.812 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] Agent out of sync with plugin!#033[00m Feb 20 04:30:49 localhost python3.9[256922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:30:49 localhost python3.9[257012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579848.678256-1079-101791592281590/.source.yaml _original_basename=.1ep_k65a follow=False checksum=9a7aca9285be233ff868b04cb9ff99cde755c904 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:30:50 localhost nova_compute[230552]: 2026-02-20 09:30:50.139 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:30:50 localhost python3.9[257122]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:30:50 localhost systemd[1]: Stopping neutron_sriov_agent container... 
Feb 20 04:30:50 localhost systemd[1]: libpod-ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae.scope: Deactivated successfully. Feb 20 04:30:50 localhost systemd[1]: libpod-ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae.scope: Consumed 1.757s CPU time. Feb 20 04:30:50 localhost podman[257126]: 2026-02-20 09:30:50.792898132 +0000 UTC m=+0.090756814 container died ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, managed_by=edpm_ansible) Feb 20 04:30:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:30:50 localhost systemd[1]: var-lib-containers-storage-overlay-5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5-merged.mount: Deactivated successfully. Feb 20 04:30:50 localhost podman[257126]: 2026-02-20 09:30:50.842057759 +0000 UTC m=+0.139916421 container cleanup ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:30:50 localhost podman[257126]: neutron_sriov_agent Feb 20 04:30:50 localhost podman[257151]: 2026-02-20 09:30:50.929989075 +0000 UTC m=+0.058524818 container cleanup ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae 
(image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=neutron_sriov_agent, managed_by=edpm_ansible) Feb 20 04:30:50 localhost podman[257151]: neutron_sriov_agent Feb 20 04:30:50 localhost systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully. Feb 20 04:30:50 localhost systemd[1]: Stopped neutron_sriov_agent container. Feb 20 04:30:50 localhost systemd[1]: Starting neutron_sriov_agent container... Feb 20 04:30:51 localhost systemd[1]: Started libcrun container. 
Feb 20 04:30:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 20 04:30:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:30:51 localhost podman[257162]: 2026-02-20 09:30:51.095654759 +0000 UTC m=+0.121694068 container init ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:30:51 localhost 
podman[257162]: 2026-02-20 09:30:51.105791882 +0000 UTC m=+0.131831191 container start ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible) Feb 20 04:30:51 localhost podman[257162]: neutron_sriov_agent Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + sudo -E kolla_set_configs Feb 20 04:30:51 localhost systemd[1]: Started neutron_sriov_agent container. 
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Validating config file
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Copying service configuration files
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Writing out command to execute
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: ++ cat /run_command
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + ARGS=
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + sudo kolla_copy_cacerts
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + [[ ! -n '' ]]
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + . kolla_extend_start
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + umask 0022
Feb 20 04:30:51 localhost neutron_sriov_agent[257177]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 20 04:30:51 localhost systemd[1]: session-58.scope: Deactivated successfully.
Feb 20 04:30:51 localhost systemd[1]: session-58.scope: Consumed 23.096s CPU time.
Feb 20 04:30:51 localhost systemd-logind[759]: Session 58 logged out. Waiting for processes to exit.
Feb 20 04:30:51 localhost systemd-logind[759]: Removed session 58.
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.941 2 INFO neutron.common.config [-] Logging enabled!#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.941 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005625204.localdomain'}#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] RPC agent_id: nic-switch-agent.np0005625204.localdomain#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.947 2 INFO neutron.agent.agent_extensions_manager [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] Loaded agent extensions: ['qos']#033[00m
Feb 20 04:30:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:52.948 2 INFO neutron.agent.agent_extensions_manager [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] Initializing agent extension 'qos'#033[00m
Feb 20 04:30:53 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:53.067 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] Agent initialized successfully, now running... #033[00m
Feb 20 04:30:53 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:53.067 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m
Feb 20 04:30:53 localhost neutron_sriov_agent[257177]: 2026-02-20 09:30:53.068 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] Agent out of sync with plugin!#033[00m
Feb 20 04:30:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:30:54 localhost podman[257210]: 2026-02-20 09:30:54.150381923 +0000 UTC m=+0.084979054 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 04:30:54 localhost podman[257210]: 2026-02-20 09:30:54.164099617 +0000 UTC m=+0.098696758 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 20 04:30:54 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 04:30:55 localhost nova_compute[230552]: 2026-02-20 09:30:55.141 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:30:55 localhost nova_compute[230552]: 2026-02-20 09:30:55.143 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:30:55 localhost nova_compute[230552]: 2026-02-20 09:30:55.143 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:30:55 localhost nova_compute[230552]: 2026-02-20 09:30:55.144 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:30:55 localhost nova_compute[230552]: 2026-02-20 09:30:55.180 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:30:55 localhost nova_compute[230552]: 2026-02-20 09:30:55.181 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:30:56 localhost openstack_network_exporter[244414]: ERROR 09:30:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 04:30:56 localhost openstack_network_exporter[244414]:
Feb 20 04:30:56 localhost openstack_network_exporter[244414]: ERROR 09:30:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 04:30:56 localhost openstack_network_exporter[244414]:
Feb 20 04:30:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 04:30:57 localhost systemd[1]: tmp-crun.3uclED.mount: Deactivated successfully.
Feb 20 04:30:57 localhost podman[257233]: 2026-02-20 09:30:57.141951968 +0000 UTC m=+0.077966578 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 04:30:57 localhost podman[257233]: 2026-02-20 09:30:57.151221964 +0000 UTC m=+0.087236614 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.openshift.tags=minimal rhel9, distribution-scope=public)
Feb 20 04:30:57 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 04:30:57 localhost sshd[257253]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:30:57 localhost systemd-logind[759]: New session 59 of user zuul.
Feb 20 04:30:57 localhost systemd[1]: Started Session 59 of User zuul.
Feb 20 04:30:58 localhost python3.9[257364]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 04:30:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:30:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:30:59 localhost podman[257479]: 2026-02-20 09:30:59.763407735 +0000 UTC m=+0.084541121 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 04:30:59 localhost podman[257480]: 2026-02-20 09:30:59.774845238 +0000 UTC m=+0.094436227 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:30:59 localhost podman[257480]: 2026-02-20 09:30:59.778992716 +0000 UTC m=+0.098583765 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 04:30:59 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:30:59 localhost podman[257479]: 2026-02-20 09:30:59.838024108 +0000 UTC m=+0.159157504 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Feb 20 04:30:59 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:30:59 localhost python3.9[257478]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 04:31:00 localhost nova_compute[230552]: 2026-02-20 09:31:00.181 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:31:00 localhost nova_compute[230552]: 2026-02-20 09:31:00.184 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:31:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44627 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D1CA40000000001030307)
Feb 20 04:31:00 localhost python3.9[257582]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 04:31:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44628 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D20A90000000001030307)
Feb 20 04:31:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24052 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D23680000000001030307)
Feb 20 04:31:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44629 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D28A90000000001030307)
Feb 20 04:31:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48587 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D2D680000000001030307)
Feb 20 04:31:05 localhost nova_compute[230552]: 2026-02-20 09:31:05.184 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:31:05 localhost nova_compute[230552]: 2026-02-20 09:31:05.184 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:31:05 localhost nova_compute[230552]: 2026-02-20 09:31:05.185 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:31:05 localhost nova_compute[230552]: 2026-02-20 09:31:05.185 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:31:05 localhost nova_compute[230552]: 2026-02-20 09:31:05.186 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:31:05 localhost nova_compute[230552]: 2026-02-20 09:31:05.188 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:31:05 localhost python3.9[257694]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 04:31:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:31:05.994 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 04:31:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:31:05.994 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 04:31:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:31:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 04:31:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:31:07 localhost systemd[1]: tmp-crun.JUuFlV.mount: Deactivated successfully.
Feb 20 04:31:07 localhost podman[257698]: 2026-02-20 09:31:07.050405051 +0000 UTC m=+0.110190792 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 04:31:07 localhost podman[257698]: 2026-02-20 09:31:07.064066229 +0000 UTC m=+0.123851930 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:31:07 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 04:31:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44630 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D38680000000001030307)
Feb 20 04:31:07 localhost python3.9[257826]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/container-startup-config setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:31:08 localhost sshd[257914]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:31:08 localhost python3.9[257937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:31:09 localhost python3.9[258047]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:31:10 localhost python3.9[258193]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:31:10 localhost nova_compute[230552]: 2026-02-20 09:31:10.189 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:31:10 localhost nova_compute[230552]: 2026-02-20 09:31:10.192 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:31:10 localhost nova_compute[230552]: 2026-02-20 09:31:10.192 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:31:10 localhost nova_compute[230552]: 2026-02-20 09:31:10.192 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:31:10 localhost nova_compute[230552]: 2026-02-20 09:31:10.240 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:31:10 localhost nova_compute[230552]: 2026-02-20 09:31:10.240 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:31:10 localhost python3.9[258334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts
setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:11 localhost python3.9[258462]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:11 localhost python3.9[258572]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:12 localhost python3.9[258682]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:13 localhost python3.9[258770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579872.2013342-275-8837160581450/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=472c5e922ae22c8bdcaef73d1ca73ce5597b440e backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:14 localhost python3.9[258878]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:31:15 localhost podman[258962]: 2026-02-20 09:31:15.148411832 +0000 UTC m=+0.083599160 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:31:15 localhost podman[258962]: 2026-02-20 09:31:15.155783292 +0000 UTC m=+0.090970650 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, 
managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:31:15 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.241 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.243 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.243 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.243 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:31:15 localhost python3.9[258965]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771579874.2924902-320-135871475757954/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.275 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.275 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.319 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - 
- - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.319 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.320 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.320 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.321 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.736 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.807 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:31:15 localhost nova_compute[230552]: 2026-02-20 09:31:15.808 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.047 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.049 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12284MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.050 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.051 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.171 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.172 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.173 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:31:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44631 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D59680000000001030307) Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.228 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf 
/etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:31:16 localhost python3.9[259118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.705 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.711 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.726 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.728 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] 
Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:31:16 localhost nova_compute[230552]: 2026-02-20 09:31:16.729 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:31:16 localhost python3.9[259224]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579875.393824-320-76366033268633/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:17 localhost python3.9[259334]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:17 localhost podman[241968]: time="2026-02-20T09:31:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:31:17 localhost podman[241968]: @ - - [20/Feb/2026:09:31:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147332 "" "Go-http-client/1.1" Feb 20 04:31:17 localhost nova_compute[230552]: 2026-02-20 09:31:17.729 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] 
Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:31:17 localhost nova_compute[230552]: 2026-02-20 09:31:17.730 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:31:17 localhost podman[241968]: @ - - [20/Feb/2026:09:31:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16350 "" "Go-http-client/1.1" Feb 20 04:31:17 localhost python3.9[259421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579877.022893-320-135267564593864/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=6db054ed7c6b84ef126ce933bbe7fb92f050e130 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.203 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': 
'3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.218 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.219 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '531d82e3-5967-440e-9426-4ead6decbd01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.204455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8d4698a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': '99cae4041ed73933b7b5bade35c2f711763d32a5afa27892604c80c81fd86468'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.204455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8d48064-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': 'bace37cdafdc19835fa07ceb94dd3250728241e0420f50d2d75186bcfc9dfbe0'}]}, 'timestamp': '2026-02-20 09:31:18.220021', '_unique_id': '0e90f5dd52d041e88a4fa4890f8818e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:31:18.221 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:31:18.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.227 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2daf9df5-7d8c-45c4-9131-6f1df2e6a472', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.223211', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8d5aab6-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '287615887a73820205a6366b254d425838c151ffb3083b1927a21b5c9ebc3ad4'}]}, 'timestamp': '2026-02-20 09:31:18.227765', '_unique_id': '8e360c5a43b949ecb7d0b0c47a2c14a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.230 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.231 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ced7a21-0f42-4a5e-914a-ad26b17aee42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.230611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8d63274-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': '5847cd8758f1c3856022ee86865163365bbaf01467263c08fe161f9c76333c51'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.230611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8d643fe-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': '7bb701560840439bd996e7241176c86e57d636893f5de86e6e7ec430bd1e9fcf'}]}, 'timestamp': '2026-02-20 09:31:18.231592', '_unique_id': '6e4bec5b43354a50bcbae985bfe875a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.234 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcf87186-e01d-49a4-97d8-6d4e55b7833c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.234121', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8d6b9e2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '1bc8486e5e6c3c51abbc4b7f27a0c958bf920500083b3c3c961dbb8656708949'}]}, 'timestamp': '2026-02-20 09:31:18.234607', '_unique_id': 'acb5c910c28f405fa0e88694cdac6657'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.236 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.274 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.275 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '844f284a-cca3-49e7-929b-9dcf92fa2aeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.237028', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8dcff0a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '34739f25bd2249ea1ff99ef0f8aac2b05b0ff1add913b03e7336e765d74292fe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.237028', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8dd19f4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '6e3653412230c3a8f410aacb5e76322dd4c83f84675fd69809cd50f40bc86f4f'}]}, 'timestamp': '2026-02-20 09:31:18.276484', '_unique_id': '4fb935d9531544d3ae9919d06e4501a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.279 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.280 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f546e369-193c-4565-904b-6efd443020fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.279805', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8ddb71a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': '6f4af09f8753669acdc94901ee1fcf0fb97bfffa2bc0817f84b1fade4742575d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.279805', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8ddcc00-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': 'c41734e1535c3d902c6597040acb33ebf463c9ccbc491d501829282fdab17984'}]}, 'timestamp': '2026-02-20 09:31:18.280962', '_unique_id': 'fd246f75139848af8859ba313bd3f6b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.283 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51fa73f1-9fc1-4849-a979-7d5439d901f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.283481', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8de4748-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '236024e03471189dfcc04cdbcc664eee90287d704132a2c8a37c87acfe38bbe5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': 
'91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.283481', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8de5c42-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '967281a09b76292031704f79d938ed62534ea0fd03e9139d76390d1ad5b6624b'}]}, 'timestamp': '2026-02-20 09:31:18.284611', '_unique_id': 'b680e3958f4944369f73ae3d48b9cffc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.286 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07461f73-e353-4aca-9e52-7176219690d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.286844', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8dec556-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '1444d7206779fda231c00bc06699a15ba7a9490e413297cffe1640cf7aee81d8'}]}, 'timestamp': '2026-02-20 09:31:18.287327', '_unique_id': '54f141135e444396aeb45cce9d9a04c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR 
oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.289 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d44cc1e-ac3a-4d39-aca3-7e0fc0238e58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.289847', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8df3f36-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': 'aa09228a2eb024e551c5be8124bd39286805d20b8ad3364293c3fff752381f4d'}]}, 'timestamp': '2026-02-20 09:31:18.290448', '_unique_id': '6546353599194a809dad3c731cd37085'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.292 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.292 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.292 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.292 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b04d965-dfdd-4c9f-9970-d158abc4bf11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.292953', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8dfb3d0-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '02d4dbe53779ec08dbabb3c5f3154cbeb835aed4212a47fac1cb4450d936c5db'}]}, 'timestamp': '2026-02-20 09:31:18.293430', '_unique_id': '609914e2e8754724b8bfe353889478eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
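The innermost frame of every traceback above is `amqp.transport` calling `self.sock.connect(sa)`: errno 111 means the TCP connect to the message broker was actively refused, i.e. nothing (typically RabbitMQ) is listening on the broker port reachable from this host, and oslo.messaging wraps the failure as `kombu.exceptions.OperationalError`. A minimal probe that reproduces just that bottom frame is sketched below; the host and port are assumptions (5672 is only RabbitMQ's default, and the log does not show the configured `transport_url`):

```python
import socket

def probe_amqp(host="localhost", port=5672, timeout=3.0):
    """Return True if a TCP connection to the broker port succeeds.

    This mirrors the failing step in the traceback: amqp.transport
    ultimately does a plain socket connect before any AMQP handshake.
    Host/port are assumptions -- check transport_url in the service config.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:  # ConnectionRefusedError is [Errno 111]
        print(f"connect to {host}:{port} failed: {exc}")
        return False
```

If the probe fails with errno 111 from the compute node, the problem is the broker (stopped container/service, wrong port, or a firewall sending RST), not ceilometer itself.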
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.295 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:31:18 localhost nova_compute[230552]: 2026-02-20 09:31:18.296 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 04:31:18 localhost nova_compute[230552]: 2026-02-20 09:31:18.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 04:31:18 localhost nova_compute[230552]: 2026-02-20 09:31:18.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 04:31:18 localhost nova_compute[230552]: 2026-02-20 09:31:18.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 60080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68c37424-e1da-4a6b-89e2-98e84665558d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60080000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:31:18.295544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e8e44a80-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.562021775, 'message_signature': '2a36048e9dd0bea7f634cc3963b1ce73a3de36d6e6f45438cc6f22df2cba3f55'}]}, 'timestamp': '2026-02-20 09:31:18.323533', '_unique_id': '4636f2fff18840bc9ea3c74998ca9881'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:31:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1282cf69-4393-47b9-aa68-25fe63ed5a32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.326098', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e4c2d0-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '7887e433aacfcb6ee4a21333155d41fff97b04b90a20a3e68bfb69ba23216c0b'}]}, 'timestamp': '2026-02-20 09:31:18.326599', '_unique_id': 'fbd9bf9b8e72474c8b8fb34a2dc1a31d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.328 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c355ec3d-f46d-4bae-9e59-c97c2fe6be7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:31:18.329008', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e8e53490-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.562021775, 'message_signature': '9de4f74a1b869fe1b37e03e237470502975649ff41f4425596c3549bb51fb4d7'}]}, 'timestamp': '2026-02-20 09:31:18.329496', '_unique_id': 'cc570bceee8b427e93be81c20531843f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in
_reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 04:31:18
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6ff050d-29e0-498a-8141-a014840d0936', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.331955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8e5a772-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '3f90e05e70ca8a4796058c2472fb71a344a5bd9e5f7b05fda3fb03b5e92ac37a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.331955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8e5bc08-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': 'a9883a030ce3e744efba5cc302396451c3bd060c951c0304c2a2da7d80896a7a'}]}, 'timestamp': '2026-02-20 09:31:18.333015', '_unique_id': 'ecf16736064d49448b031b82a44d119f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in
establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:31:18.334 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.335 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.335 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e567638b-5f74-4ca6-ac71-1ba9e2ed519d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.335368', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 
'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e62d28-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': 'a81f28a46599d16b6cf78bf269bbbf7125bc589aac44367c44bf5ae5173a1ace'}]}, 'timestamp': '2026-02-20 09:31:18.335909', '_unique_id': '59f5e9c781a34785904f34703035e613'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.338 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8617e1b4-2079-4ae9-a05a-bf0b214ce0b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.338319', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e6a05a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '4d9ec339e013dcb43b2fc46556dcfd81c276e54e0257974b337efb9fdf31d0b7'}]}, 'timestamp': '2026-02-20 09:31:18.338865', '_unique_id': '4fbaa9325a5e42fca8a938bfbf99199a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.341 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad9c339d-7dc1-49ac-8c1a-33752f0a42df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.341241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8e7156c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '3e67649ded89e3fa9c1b4dde7cbd139cdd482aa2a655a6c82f7cea78682fc23b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.341241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8e72930-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '3c4d6d2ac99eed3e9c5e76d9089b882ac0502400b5a291deabd075aeb51822ad'}]}, 'timestamp': '2026-02-20 09:31:18.342299', '_unique_id': '2566463369904443b8c6014ee8769ecb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.344 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3eef6a48-b54c-4ece-9e07-05400eb1cf21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.344686', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e79960-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '015055ce0b62218d8af6e4e72aff84a61a3e8bf3ad40c312aaf16faa06926d95'}]}, 'timestamp': '2026-02-20 09:31:18.345202', '_unique_id': 'bccfdaa13bfe45b5a290ae2780c56ff7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.347 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.348 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc4923db-7801-4d2d-a55b-45d0ead1f0a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.347543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8e80b84-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': 'bb238fad1ad6e5500ab1cb88a2450dc29c3ab64f9f32a6757413099a84bb8273'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.347543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8e823e4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': 'b10e6db4746d002c7077d2dea7bbedef772d01731edefe1c47a44a741af664bf'}]}, 'timestamp': '2026-02-20 09:31:18.348766', '_unique_id': 'aac100ad402c458e9029c9dd55f1570f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20
04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.350 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.351 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 8991 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c5cfd6ae-ddff-487e-a495-6133027ab61a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8991, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.351081', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e892e8-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '05b4397309f6fed456b7eb1b24112dbc75c8e082352ec717241dfe9d1b9422e7'}]}, 'timestamp': '2026-02-20 09:31:18.351708', '_unique_id': 'bf1edc6df27a4d2eadf497e5a247703b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.353 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.353 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3646bcc4-4d88-4678-8fb9-5c982e7f3808', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.353771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8e8f774-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '7087c7e586754e2bbd32ba36af811441a67c73edd1e3898802ea37b25fcc5c86'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.353771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8e901c4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '550b88a63520892d92505b5c41409e060c7bfa2d7b4744eddd88fe5cbf319c75'}]}, 'timestamp': '2026-02-20 09:31:18.354298', '_unique_id': '45cd21cd13fc410582d1deb52806b384'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:31:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:31:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:31:19 localhost python3.9[259529]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:19 localhost nova_compute[230552]: 2026-02-20 09:31:19.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:31:19 localhost nova_compute[230552]: 2026-02-20 09:31:19.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:31:19 localhost nova_compute[230552]: 2026-02-20 09:31:19.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:31:19 localhost nova_compute[230552]: 2026-02-20 09:31:19.571 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:31:19 localhost nova_compute[230552]: 2026-02-20 09:31:19.571 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:31:19 localhost nova_compute[230552]: 2026-02-20 09:31:19.572 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:31:19 localhost nova_compute[230552]: 2026-02-20 09:31:19.572 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:31:19 localhost python3.9[259615]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771579878.7480402-494-147774350184856/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=13d630d090b626c2aab1085bca0daa7abb0cabfd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:20 localhost nova_compute[230552]: 2026-02-20 09:31:20.075 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:31:20 localhost nova_compute[230552]: 2026-02-20 09:31:20.196 230556 DEBUG oslo_concurrency.lockutils [None 
req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:31:20 localhost nova_compute[230552]: 2026-02-20 09:31:20.197 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:31:20 localhost nova_compute[230552]: 2026-02-20 09:31:20.277 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:20 localhost python3.9[259723]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:21 localhost python3.9[259809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579879.8442059-539-95077197649537/.source follow=False _original_basename=haproxy.j2 checksum=eddfecb822bb60e7241db0fd719c7552d2d25452 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:21 localhost python3.9[259917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:22 localhost python3.9[260003]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper 
mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579881.1687446-539-14955454410103/.source follow=False _original_basename=dnsmasq.j2 checksum=a6b8b2fb47e7419d250eaee9e3565b13fff8f42e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:22 localhost python3.9[260111]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:23 localhost python3.9[260166]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:23 localhost python3.9[260274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:24 localhost python3.9[260360]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579883.2872195-626-97379393248581/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None 
group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:31:25 localhost systemd[1]: tmp-crun.Va7dpx.mount: Deactivated successfully. Feb 20 04:31:25 localhost podman[260378]: 2026-02-20 09:31:25.149012584 +0000 UTC m=+0.080846504 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:31:25 localhost podman[260378]: 2026-02-20 09:31:25.183925277 +0000 UTC m=+0.115759127 container exec_died 
f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:31:25 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:31:25 localhost nova_compute[230552]: 2026-02-20 09:31:25.280 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:25 localhost nova_compute[230552]: 2026-02-20 09:31:25.282 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:25 localhost nova_compute[230552]: 2026-02-20 09:31:25.283 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:31:25 localhost nova_compute[230552]: 2026-02-20 09:31:25.283 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:31:25 localhost nova_compute[230552]: 2026-02-20 09:31:25.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:31:25 localhost nova_compute[230552]: 2026-02-20 09:31:25.315 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:31:25 localhost python3.9[260491]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:31:26 localhost openstack_network_exporter[244414]: ERROR 09:31:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:31:26 localhost openstack_network_exporter[244414]: Feb 20 04:31:26 localhost openstack_network_exporter[244414]: ERROR 09:31:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:31:26 
localhost openstack_network_exporter[244414]: Feb 20 04:31:27 localhost python3.9[260603]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:31:27 localhost podman[260713]: 2026-02-20 09:31:27.533049207 +0000 UTC m=+0.086785860 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1770267347, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, version=9.7, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9/ubi-minimal) Feb 20 04:31:27 localhost podman[260713]: 2026-02-20 09:31:27.549002596 +0000 UTC m=+0.102739269 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, config_id=openstack_network_exporter, vcs-type=git) Feb 20 04:31:27 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:31:27 localhost python3.9[260714]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:28 localhost python3.9[260790]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:28 localhost python3.9[260900]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:29 localhost python3.9[260957]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:29 localhost python3.9[261067]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None 
group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:31:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:31:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:31:30 localhost podman[261144]: 2026-02-20 09:31:30.131089593 +0000 UTC m=+0.069799758 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Feb 20 04:31:30 localhost podman[261144]: 2026-02-20 09:31:30.136465992 +0000 UTC m=+0.075176147 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:31:30 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:31:30 localhost podman[261141]: 2026-02-20 09:31:30.192549858 +0000 UTC m=+0.130740486 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 04:31:30 localhost podman[261141]: 2026-02-20 09:31:30.235343099 +0000 UTC m=+0.173533757 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:31:30 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:31:30 localhost nova_compute[230552]: 2026-02-20 09:31:30.313 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:31:30 localhost python3.9[261203]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23771 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D91D40000000001030307) Feb 20 04:31:30 localhost python3.9[261276]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:31:31 localhost python3.9[261386]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23772 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D95E80000000001030307) Feb 20 
04:31:31 localhost python3.9[261443]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:31:32 localhost sshd[261515]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:31:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44632 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D99680000000001030307) Feb 20 04:31:32 localhost python3.9[261555]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:31:32 localhost systemd[1]: Reloading. Feb 20 04:31:32 localhost systemd-rc-local-generator[261579]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:31:32 localhost systemd-sysv-generator[261585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:33 localhost python3.9[261702]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23773 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D9DE80000000001030307) Feb 20 04:31:34 localhost python3.9[261759]: ansible-ansible.legacy.file 
Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:31:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24053 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599DA1680000000001030307) Feb 20 04:31:34 localhost python3.9[261869]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:35 localhost nova_compute[230552]: 2026-02-20 09:31:35.316 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:31:35 localhost python3.9[261926]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:31:36 localhost python3.9[262036]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder 
state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:31:36 localhost systemd[1]: Reloading. Feb 20 04:31:36 localhost systemd-rc-local-generator[262061]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:31:36 localhost systemd-sysv-generator[262064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:36 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:31:36 localhost systemd[1]: Starting Create netns directory... 
Feb 20 04:31:36 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 20 04:31:36 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 20 04:31:36 localhost systemd[1]: Finished Create netns directory. Feb 20 04:31:37 localhost sshd[262097]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:31:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23774 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599DADA80000000001030307) Feb 20 04:31:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:31:38 localhost podman[262143]: 2026-02-20 09:31:38.152518464 +0000 UTC m=+0.086784259 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 
'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute) Feb 20 04:31:38 localhost podman[262143]: 2026-02-20 09:31:38.167190015 +0000 UTC m=+0.101455800 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Feb 20 04:31:38 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. 
Feb 20 04:31:38 localhost python3.9[262210]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:31:39 localhost python3.9[262320]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:31:39 localhost python3.9[262430]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:31:40 localhost python3.9[262518]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579899.3380752-1094-67246515093766/.source.json _original_basename=.b13xauaa follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:31:40 localhost nova_compute[230552]: 2026-02-20 09:31:40.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:40 localhost 
nova_compute[230552]: 2026-02-20 09:31:40.321 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:40 localhost nova_compute[230552]: 2026-02-20 09:31:40.321 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:31:40 localhost nova_compute[230552]: 2026-02-20 09:31:40.321 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:31:40 localhost nova_compute[230552]: 2026-02-20 09:31:40.356 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:31:40 localhost nova_compute[230552]: 2026-02-20 09:31:40.356 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:31:41 localhost python3.9[262626]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:31:43 localhost python3.9[262930]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False Feb 20 04:31:44 localhost python3.9[263040]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:31:45 localhost python3[263150]: 
ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json containers=['neutron_dhcp_agent'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:31:45 localhost nova_compute[230552]: 2026-02-20 09:31:45.358 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:45 localhost nova_compute[230552]: 2026-02-20 09:31:45.359 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:31:45 localhost nova_compute[230552]: 2026-02-20 09:31:45.360 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:31:45 localhost nova_compute[230552]: 2026-02-20 09:31:45.360 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:31:45 localhost nova_compute[230552]: 2026-02-20 09:31:45.410 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:31:45 localhost nova_compute[230552]: 2026-02-20 09:31:45.411 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:31:45 localhost podman[263186]: Feb 20 04:31:45 localhost podman[263186]: 2026-02-20 09:31:45.592065707 +0000 UTC m=+0.066922577 container create 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, config_id=neutron_dhcp, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}) Feb 20 04:31:45 localhost podman[263186]: 2026-02-20 09:31:45.559977352 +0000 UTC m=+0.034834222 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:31:45 localhost python3[263150]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93 --label 
config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 
04:31:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23775 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599DCD690000000001030307) Feb 20 04:31:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:31:46 localhost podman[263279]: 2026-02-20 09:31:46.164862701 +0000 UTC m=+0.092294102 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:31:46 localhost podman[263279]: 2026-02-20 09:31:46.180004615 +0000 UTC m=+0.107436036 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 20 04:31:46 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 04:31:46 localhost python3.9[263356]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:31:47 localhost python3.9[263468]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:31:47 localhost podman[241968]: time="2026-02-20T09:31:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 04:31:47 localhost podman[241968]: @ - - [20/Feb/2026:09:31:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149681 "" "Go-http-client/1.1"
Feb 20 04:31:47 localhost podman[241968]: @ - - [20/Feb/2026:09:31:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16662 "" "Go-http-client/1.1"
Feb 20 04:31:48 localhost python3.9[263523]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:31:48 localhost python3.9[263632]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579908.2732587-1328-186789700356022/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:31:49 localhost python3.9[263687]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 04:31:49 localhost systemd[1]: Reloading.
Feb 20 04:31:49 localhost systemd-sysv-generator[263712]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:31:49 localhost systemd-rc-local-generator[263708]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:50 localhost python3.9[263778]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 04:31:50 localhost nova_compute[230552]: 2026-02-20 09:31:50.411 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:31:50 localhost nova_compute[230552]: 2026-02-20 09:31:50.418 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:31:51 localhost systemd[1]: Reloading.
Feb 20 04:31:51 localhost systemd-rc-local-generator[263805]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:31:51 localhost systemd-sysv-generator[263811]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:51 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:31:51 localhost systemd[1]: Starting neutron_dhcp_agent container...
Feb 20 04:31:51 localhost systemd[1]: Started libcrun container.
Feb 20 04:31:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 20 04:31:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:31:51 localhost podman[263819]: 2026-02-20 09:31:51.981606419 +0000 UTC m=+0.124749130 container init 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 20 04:31:51 localhost podman[263819]: 2026-02-20 09:31:51.993206332 +0000 UTC m=+0.136349093 container start 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=neutron_dhcp, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:31:51 localhost podman[263819]: neutron_dhcp_agent
Feb 20 04:31:51 localhost neutron_dhcp_agent[263834]: + sudo -E kolla_set_configs
Feb 20 04:31:51 localhost systemd[1]: Started neutron_dhcp_agent container.
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Validating config file
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Copying service configuration files
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Writing out command to execute
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: ++ cat /run_command
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: + ARGS=
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: + sudo kolla_copy_cacerts
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: + [[ ! -n '' ]]
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: + . kolla_extend_start
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: + umask 0022
Feb 20 04:31:52 localhost neutron_dhcp_agent[263834]: + exec /usr/bin/neutron-dhcp-agent
Feb 20 04:31:52 localhost python3.9[263956]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 04:31:53 localhost neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.274 263838 INFO neutron.common.config [-] Logging enabled!#033[00m
Feb 20 04:31:53 localhost neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.274 263838 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44#033[00m
Feb 20 04:31:53 localhost neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.636 263838 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m
Feb 20 04:31:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:31:53.908 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 20 04:31:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:31:53.910 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table
for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb 20 04:31:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:31:53.911 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 20 04:31:53 localhost neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.911 263838 INFO neutron.agent.dhcp.agent [None req-bf66d6ec-c731-433c-b729-240ac9f8124c - - - - - -] All active networks have been fetched through RPC.#033[00m
Feb 20 04:31:53 localhost neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.911 263838 INFO neutron.agent.dhcp.agent [None req-bf66d6ec-c731-433c-b729-240ac9f8124c - - - - - -] Synchronizing state complete#033[00m
Feb 20 04:31:53 localhost nova_compute[230552]: 2026-02-20 09:31:53.940 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:31:53 localhost python3.9[264067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:31:53 localhost neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.994 263838 INFO neutron.agent.dhcp.agent [None req-bf66d6ec-c731-433c-b729-240ac9f8124c - - - - - -] DHCP agent started#033[00m
Feb 20 04:31:54 localhost python3.9[264157]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579913.429464-1463-94152542863959/.source.yaml _original_basename=.4a29isd8 follow=False checksum=b9ca88bcb32671aca7ddecc5a041bae0cf925d73 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:31:55 localhost python3.9[264267]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 04:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:31:55 localhost systemd[1]: Stopping neutron_dhcp_agent container...
Feb 20 04:31:55 localhost nova_compute[230552]: 2026-02-20 09:31:55.449 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:31:55 localhost podman[264269]: 2026-02-20 09:31:55.471514275 +0000 UTC m=+0.126499525 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount':
'/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:31:55 localhost podman[264269]: 2026-02-20 09:31:55.550757497 +0000 UTC m=+0.205742817 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The 
Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:31:55 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:31:56 localhost neutron_dhcp_agent[263834]: 2026-02-20 09:31:56.061 263838 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m Feb 20 04:31:56 localhost systemd[1]: libpod-0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19.scope: Deactivated successfully. Feb 20 04:31:56 localhost systemd[1]: libpod-0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19.scope: Consumed 2.011s CPU time. Feb 20 04:31:56 localhost podman[264277]: 2026-02-20 09:31:56.394377314 +0000 UTC m=+1.012619532 container died 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=neutron_dhcp, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:31:56 localhost podman[264277]: 2026-02-20 09:31:56.446625561 +0000 UTC m=+1.064867749 container cleanup 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
org.label-schema.vendor=CentOS, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=neutron_dhcp_agent) Feb 20 04:31:56 localhost podman[264277]: neutron_dhcp_agent Feb 20 04:31:56 localhost systemd[1]: var-lib-containers-storage-overlay-bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8-merged.mount: Deactivated successfully. Feb 20 04:31:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19-userdata-shm.mount: Deactivated successfully. Feb 20 04:31:56 localhost podman[264334]: error opening file `/run/crun/0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19/status`: No such file or directory Feb 20 04:31:56 localhost podman[264322]: 2026-02-20 09:31:56.555896704 +0000 UTC m=+0.071099469 container cleanup 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 04:31:56 localhost podman[264322]: neutron_dhcp_agent
Feb 20 04:31:56 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Feb 20 04:31:56 localhost systemd[1]: Stopped neutron_dhcp_agent container.
Feb 20 04:31:56 localhost openstack_network_exporter[244414]: ERROR 09:31:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 04:31:56 localhost openstack_network_exporter[244414]:
Feb 20 04:31:56 localhost openstack_network_exporter[244414]: ERROR 09:31:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 04:31:56 localhost openstack_network_exporter[244414]:
Feb 20 04:31:56 localhost systemd[1]: Starting neutron_dhcp_agent container...
Feb 20 04:31:56 localhost systemd[1]: Started libcrun container.
Feb 20 04:31:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 20 04:31:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:31:56 localhost podman[264336]: 2026-02-20 09:31:56.737571045 +0000 UTC m=+0.118048189 container init 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, tcib_managed=true, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:31:56 localhost podman[264336]: 2026-02-20 09:31:56.746288138 +0000 UTC m=+0.126765282 container start 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 20 04:31:56 localhost podman[264336]: neutron_dhcp_agent Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: + sudo -E kolla_set_configs Feb 20 04:31:56 localhost systemd[1]: Started neutron_dhcp_agent container. Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Validating config file Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Copying service configuration files Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Writing out command to execute Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission 
for /var/lib/neutron/dhcp Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: ++ cat /run_command Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: + CMD=/usr/bin/neutron-dhcp-agent Feb 20 04:31:56 
localhost neutron_dhcp_agent[264351]: + ARGS= Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: + sudo kolla_copy_cacerts Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: + [[ ! -n '' ]] Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: + . kolla_extend_start Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: Running command: '/usr/bin/neutron-dhcp-agent' Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: + umask 0022 Feb 20 04:31:56 localhost neutron_dhcp_agent[264351]: + exec /usr/bin/neutron-dhcp-agent Feb 20 04:31:57 localhost systemd[1]: session-59.scope: Deactivated successfully. Feb 20 04:31:57 localhost systemd[1]: session-59.scope: Consumed 35.188s CPU time. Feb 20 04:31:57 localhost systemd-logind[759]: Session 59 logged out. Waiting for processes to exit. Feb 20 04:31:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:31:57 localhost systemd-logind[759]: Removed session 59. Feb 20 04:31:57 localhost podman[264383]: 2026-02-20 09:31:57.929756212 +0000 UTC m=+0.085489899 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1770267347, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, 
org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:31:57 localhost podman[264383]: 2026-02-20 09:31:57.945923658 +0000 UTC m=+0.101657375 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.7, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1770267347, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter) Feb 20 04:31:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:31:57.955 264355 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 20 04:31:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:31:57.955 264355 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44#033[00m Feb 20 04:31:57 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:31:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:31:58.312 264355 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Feb 20 04:31:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:31:58.739 264355 INFO neutron.agent.dhcp.agent [None req-165f8f8a-8282-4a41-9f70-f0b00322bdc7 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:31:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:31:58.740 264355 INFO neutron.agent.dhcp.agent [None req-165f8f8a-8282-4a41-9f70-f0b00322bdc7 - - - - - -] Synchronizing state complete#033[00m Feb 20 04:31:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:31:58.768 264355 INFO neutron.agent.dhcp.agent [None req-165f8f8a-8282-4a41-9f70-f0b00322bdc7 - - - - - -] DHCP agent started#033[00m Feb 20 04:31:59 localhost sshd[264404]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:32:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:32:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:32:00 localhost podman[264406]: 2026-02-20 09:32:00.345152256 +0000 UTC m=+0.103443871 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 20 04:32:00 localhost 
podman[264406]: 2026-02-20 09:32:00.378100219 +0000 UTC m=+0.136391834 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 20 04:32:00 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:32:00 localhost podman[264422]: 2026-02-20 09:32:00.438418649 +0000 UTC m=+0.084700316 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127) Feb 20 04:32:00 localhost nova_compute[230552]: 2026-02-20 09:32:00.452 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:00 localhost nova_compute[230552]: 2026-02-20 09:32:00.454 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms 
timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:00 localhost nova_compute[230552]: 2026-02-20 09:32:00.454 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:00 localhost nova_compute[230552]: 2026-02-20 09:32:00.455 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:00 localhost nova_compute[230552]: 2026-02-20 09:32:00.486 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:00 localhost nova_compute[230552]: 2026-02-20 09:32:00.486 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:00 localhost podman[264422]: 2026-02-20 09:32:00.53011469 +0000 UTC m=+0.176396317 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller) Feb 20 04:32:00 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:32:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61346 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E07040000000001030307) Feb 20 04:32:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61347 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E0B280000000001030307) Feb 20 04:32:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23776 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E0D680000000001030307) Feb 20 04:32:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61348 DF PROTO=TCP SPT=52246 DPT=9102 
SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E13280000000001030307) Feb 20 04:32:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44633 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E17680000000001030307) Feb 20 04:32:05 localhost nova_compute[230552]: 2026-02-20 09:32:05.487 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:05 localhost nova_compute[230552]: 2026-02-20 09:32:05.489 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:05 localhost nova_compute[230552]: 2026-02-20 09:32:05.489 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:05 localhost nova_compute[230552]: 2026-02-20 09:32:05.489 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:05 localhost nova_compute[230552]: 2026-02-20 09:32:05.490 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:05 localhost nova_compute[230552]: 2026-02-20 09:32:05.494 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:32:05.995 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:32:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:32:05.995 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:32:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:32:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:32:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61349 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E22E80000000001030307) Feb 20 04:32:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:32:09 localhost podman[264451]: 2026-02-20 09:32:09.137231748 +0000 UTC m=+0.076163687 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, 
io.buildah.version=1.41.3) Feb 20 04:32:09 localhost podman[264451]: 2026-02-20 09:32:09.150182943 +0000 UTC m=+0.089114892 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3) Feb 20 04:32:09 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:32:10 localhost nova_compute[230552]: 2026-02-20 09:32:10.495 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:10 localhost nova_compute[230552]: 2026-02-20 09:32:10.497 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:10 localhost nova_compute[230552]: 2026-02-20 09:32:10.497 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:10 localhost nova_compute[230552]: 2026-02-20 09:32:10.498 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:10 localhost nova_compute[230552]: 2026-02-20 09:32:10.539 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:10 localhost nova_compute[230552]: 2026-02-20 09:32:10.540 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:15 localhost nova_compute[230552]: 2026-02-20 09:32:15.541 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:15 localhost nova_compute[230552]: 2026-02-20 09:32:15.543 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:15 localhost nova_compute[230552]: 2026-02-20 09:32:15.544 230556 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:15 localhost nova_compute[230552]: 2026-02-20 09:32:15.544 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:15 localhost nova_compute[230552]: 2026-02-20 09:32:15.587 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:15 localhost nova_compute[230552]: 2026-02-20 09:32:15.588 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61350 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E43680000000001030307) Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] 
Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.335 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.335 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.335 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.336 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.336 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.816 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.870 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:32:16 localhost nova_compute[230552]: 2026-02-20 09:32:16.871 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:32:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.078 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.080 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12195MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.080 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.080 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.150 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.150 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.150 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:32:17 localhost systemd[1]: tmp-crun.6h2fVm.mount: Deactivated successfully. Feb 20 04:32:17 localhost podman[264579]: 2026-02-20 09:32:17.157711168 +0000 UTC m=+0.090033132 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:32:17 localhost podman[264579]: 2026-02-20 09:32:17.169208858 +0000 UTC m=+0.101530802 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:32:17 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.188 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.656 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.664 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.684 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.687 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for 
np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:32:17 localhost nova_compute[230552]: 2026-02-20 09:32:17.687 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:32:17 localhost podman[241968]: time="2026-02-20T09:32:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:32:17 localhost podman[241968]: @ - - [20/Feb/2026:09:32:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1" Feb 20 04:32:17 localhost podman[241968]: @ - - [20/Feb/2026:09:32:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16790 "" "Go-http-client/1.1" Feb 20 04:32:18 localhost nova_compute[230552]: 2026-02-20 09:32:18.687 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:18 localhost nova_compute[230552]: 2026-02-20 09:32:18.688 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:18 localhost nova_compute[230552]: 2026-02-20 09:32:18.688 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:19 localhost nova_compute[230552]: 2026-02-20 09:32:19.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.295 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.312 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.312 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.588 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.590 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.590 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.591 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.592 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:20 localhost nova_compute[230552]: 2026-02-20 09:32:20.596 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.301 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.301 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.500 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.501 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.501 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.502 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.899 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": 
"e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.914 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:32:21 localhost nova_compute[230552]: 2026-02-20 09:32:21.915 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:32:24 localhost ovn_controller[156798]: 2026-02-20T09:32:24Z|00047|memory_trim|INFO|Detected inactivity (last active 30001 
ms ago): trimming memory Feb 20 04:32:25 localhost nova_compute[230552]: 2026-02-20 09:32:25.597 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:25 localhost nova_compute[230552]: 2026-02-20 09:32:25.599 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:25 localhost nova_compute[230552]: 2026-02-20 09:32:25.600 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:25 localhost nova_compute[230552]: 2026-02-20 09:32:25.600 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:25 localhost nova_compute[230552]: 2026-02-20 09:32:25.614 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:25 localhost nova_compute[230552]: 2026-02-20 09:32:25.615 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:32:26 localhost podman[264624]: 2026-02-20 09:32:26.148910688 +0000 UTC m=+0.080190323 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:32:26 localhost podman[264624]: 2026-02-20 09:32:26.158954033 +0000 UTC m=+0.090233678 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:32:26 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:32:26 localhost openstack_network_exporter[244414]: ERROR 09:32:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:32:26 localhost openstack_network_exporter[244414]: Feb 20 04:32:26 localhost openstack_network_exporter[244414]: ERROR 09:32:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:32:26 localhost openstack_network_exporter[244414]: Feb 20 04:32:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:32:28 localhost podman[264648]: 2026-02-20 09:32:28.146105442 +0000 UTC m=+0.080868094 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Feb 20 04:32:28 localhost podman[264648]: 2026-02-20 09:32:28.158913714 +0000 UTC m=+0.093676336 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, release=1770267347, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9/ubi-minimal) Feb 20 04:32:28 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:32:30 localhost nova_compute[230552]: 2026-02-20 09:32:30.616 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:30 localhost nova_compute[230552]: 2026-02-20 09:32:30.618 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:30 localhost nova_compute[230552]: 2026-02-20 09:32:30.618 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:30 localhost nova_compute[230552]: 2026-02-20 09:32:30.619 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8307 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E7C350000000001030307) Feb 20 04:32:30 localhost nova_compute[230552]: 2026-02-20 09:32:30.647 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:30 localhost nova_compute[230552]: 2026-02-20 09:32:30.648 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:32:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:32:30 localhost podman[264668]: 2026-02-20 09:32:30.853401541 +0000 UTC m=+0.065219365 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 20 04:32:30 localhost podman[264668]: 2026-02-20 09:32:30.890976327 +0000 UTC m=+0.102794171 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Feb 20 04:32:30 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:32:30 localhost podman[264669]: 2026-02-20 09:32:30.9651085 +0000 UTC m=+0.174575490 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:32:30 localhost 
podman[264669]: 2026-02-20 09:32:30.974994419 +0000 UTC m=+0.184461479 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 04:32:30 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:32:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8308 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E80290000000001030307) Feb 20 04:32:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61351 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E83680000000001030307) Feb 20 04:32:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8309 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E88280000000001030307) Feb 20 04:32:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23777 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E8B680000000001030307) Feb 20 04:32:35 localhost nova_compute[230552]: 2026-02-20 09:32:35.645 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8310 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E97E80000000001030307) Feb 20 
04:32:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:32:40 localhost podman[264710]: 2026-02-20 09:32:40.143941787 +0000 UTC m=+0.081882046 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 04:32:40 localhost podman[264710]: 2026-02-20 09:32:40.154373044 +0000 UTC m=+0.092313293 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 04:32:40 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:32:40 localhost nova_compute[230552]: 2026-02-20 09:32:40.649 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:40 localhost nova_compute[230552]: 2026-02-20 09:32:40.651 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:40 localhost nova_compute[230552]: 2026-02-20 09:32:40.651 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:40 localhost nova_compute[230552]: 2026-02-20 09:32:40.652 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:40 localhost nova_compute[230552]: 2026-02-20 09:32:40.669 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:40 localhost nova_compute[230552]: 2026-02-20 09:32:40.670 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:45 localhost nova_compute[230552]: 2026-02-20 09:32:45.671 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:45 localhost nova_compute[230552]: 2026-02-20 09:32:45.673 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:45 localhost nova_compute[230552]: 2026-02-20 09:32:45.673 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:45 localhost nova_compute[230552]: 2026-02-20 09:32:45.673 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:45 localhost nova_compute[230552]: 2026-02-20 09:32:45.714 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:45 localhost nova_compute[230552]: 2026-02-20 09:32:45.714 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8311 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EB7680000000001030307) Feb 20 04:32:47 localhost podman[241968]: time="2026-02-20T09:32:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:32:47 localhost podman[241968]: @ - - [20/Feb/2026:09:32:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1" Feb 20 04:32:47 localhost podman[241968]: @ - - [20/Feb/2026:09:32:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16786 "" "Go-http-client/1.1" Feb 20 04:32:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:32:48 localhost systemd[1]: tmp-crun.b1T5dK.mount: Deactivated successfully. Feb 20 04:32:48 localhost podman[264728]: 2026-02-20 09:32:48.132719044 +0000 UTC m=+0.072417760 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:32:48 localhost podman[264728]: 2026-02-20 09:32:48.141320153 +0000 UTC m=+0.081018859 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 
'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:32:48 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:32:50 localhost nova_compute[230552]: 2026-02-20 09:32:50.715 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:50 localhost nova_compute[230552]: 2026-02-20 09:32:50.717 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:50 localhost nova_compute[230552]: 2026-02-20 09:32:50.717 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:50 localhost nova_compute[230552]: 2026-02-20 09:32:50.718 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:50 localhost nova_compute[230552]: 2026-02-20 09:32:50.749 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:50 localhost nova_compute[230552]: 2026-02-20 09:32:50.750 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:52 localhost sshd[264751]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:32:55 localhost nova_compute[230552]: 2026-02-20 09:32:55.751 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:55 localhost nova_compute[230552]: 2026-02-20 09:32:55.753 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:32:55 localhost nova_compute[230552]: 2026-02-20 09:32:55.754 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:32:55 localhost nova_compute[230552]: 2026-02-20 09:32:55.754 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:55 localhost nova_compute[230552]: 2026-02-20 09:32:55.790 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:32:55 localhost nova_compute[230552]: 2026-02-20 09:32:55.791 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:32:56 localhost openstack_network_exporter[244414]: ERROR 09:32:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:32:56 localhost openstack_network_exporter[244414]: Feb 20 04:32:56 localhost openstack_network_exporter[244414]: ERROR 09:32:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:32:56 localhost openstack_network_exporter[244414]: Feb 20 04:32:56 localhost sshd[264753]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:32:57 localhost systemd-logind[759]: New session 60 of user zuul. 
Feb 20 04:32:57 localhost systemd[1]: Started Session 60 of User zuul. Feb 20 04:32:57 localhost systemd[1]: tmp-crun.H7KjML.mount: Deactivated successfully. Feb 20 04:32:57 localhost podman[264755]: 2026-02-20 09:32:57.152838141 +0000 UTC m=+0.091137306 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:32:57 localhost podman[264755]: 2026-02-20 09:32:57.16397661 +0000 UTC m=+0.102275785 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:32:57 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:32:58 localhost python3.9[264885]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:32:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:32:59 localhost podman[264923]: 2026-02-20 09:32:59.141727906 +0000 UTC m=+0.074872327 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:32:59 localhost podman[264923]: 2026-02-20 09:32:59.158043196 +0000 UTC m=+0.091187587 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, distribution-scope=public, version=9.7, vendor=Red Hat, Inc., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Feb 20 04:32:59 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:32:59 localhost python3.9[265016]: ansible-ansible.builtin.service_facts Invoked Feb 20 04:32:59 localhost network[265033]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 04:32:59 localhost network[265034]: 'network-scripts' will be removed from distribution in near future. Feb 20 04:32:59 localhost network[265035]: It is advised to switch to 'NetworkManager' instead for network management. Feb 20 04:33:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58671 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EF1640000000001030307) Feb 20 04:33:00 localhost nova_compute[230552]: 2026-02-20 09:33:00.792 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:00 localhost nova_compute[230552]: 2026-02-20 09:33:00.794 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:00 localhost nova_compute[230552]: 2026-02-20 09:33:00.794 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:00 localhost nova_compute[230552]: 2026-02-20 09:33:00.794 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:00 localhost 
nova_compute[230552]: 2026-02-20 09:33:00.834 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:00 localhost nova_compute[230552]: 2026-02-20 09:33:00.835 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:33:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:33:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:33:01 localhost podman[265075]: 2026-02-20 09:33:01.049154248 +0000 UTC m=+0.091828707 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 04:33:01 localhost podman[265089]: 2026-02-20 09:33:01.113737562 +0000 UTC m=+0.085279224 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 20 04:33:01 localhost podman[265075]: 2026-02-20 09:33:01.133721347 +0000 UTC m=+0.176395846 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.build-date=20260127) Feb 20 04:33:01 localhost podman[265089]: 2026-02-20 09:33:01.144269358 +0000 UTC m=+0.115810980 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true) Feb 20 04:33:01 localhost 
systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:33:01 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:33:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58672 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EF5680000000001030307) Feb 20 04:33:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8312 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EF7680000000001030307) Feb 20 04:33:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58673 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EFD690000000001030307) Feb 20 04:33:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61352 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F01680000000001030307) Feb 20 04:33:05 localhost nova_compute[230552]: 2026-02-20 09:33:05.836 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:05 localhost nova_compute[230552]: 2026-02-20 09:33:05.839 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:05 localhost nova_compute[230552]: 2026-02-20 09:33:05.839 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:05 localhost nova_compute[230552]: 2026-02-20 09:33:05.839 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:05 localhost nova_compute[230552]: 2026-02-20 09:33:05.879 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:05 localhost nova_compute[230552]: 2026-02-20 09:33:05.879 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:33:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:33:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:33:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:33:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:33:05.998 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:33:06 
localhost python3.9[265312]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 20 04:33:07 localhost python3.9[265375]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:33:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58674 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F0D280000000001030307) Feb 20 04:33:10 localhost nova_compute[230552]: 2026-02-20 09:33:10.880 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:10 localhost nova_compute[230552]: 2026-02-20 09:33:10.882 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:10 localhost nova_compute[230552]: 2026-02-20 09:33:10.883 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:10 localhost nova_compute[230552]: 2026-02-20 09:33:10.883 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:10 localhost nova_compute[230552]: 2026-02-20 09:33:10.911 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:10 localhost nova_compute[230552]: 2026-02-20 09:33:10.912 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:33:11 localhost podman[265466]: 2026-02-20 09:33:11.163971748 +0000 UTC m=+0.095095430 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 04:33:11 localhost podman[265466]: 2026-02-20 09:33:11.208100671 +0000 UTC m=+0.139224343 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 
'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Feb 20 04:33:11 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:33:11 localhost python3.9[265506]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:33:12 localhost python3.9[265617]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:33:14 localhost python3.9[265728]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:33:15 localhost python3.9[265876]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False 
firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:15 localhost nova_compute[230552]: 2026-02-20 09:33:15.937 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58675 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F2D680000000001030307) Feb 20 04:33:16 localhost nova_compute[230552]: 2026-02-20 09:33:16.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:16 localhost nova_compute[230552]: 2026-02-20 09:33:16.353 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:33:16 localhost nova_compute[230552]: 2026-02-20 09:33:16.355 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:33:16 localhost nova_compute[230552]: 2026-02-20 09:33:16.356 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:33:16 localhost nova_compute[230552]: 2026-02-20 09:33:16.356 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:33:16 localhost nova_compute[230552]: 2026-02-20 09:33:16.357 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:33:16 localhost nova_compute[230552]: 2026-02-20 09:33:16.830 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:33:16 localhost python3.9[266095]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.016 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.017 230556 DEBUG nova.virt.libvirt.driver [None 
req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.203 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.204 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12150MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.204 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.205 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.304 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.305 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.305 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.348 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:33:17 localhost podman[241968]: time="2026-02-20T09:33:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:33:17 localhost podman[241968]: @ - - [20/Feb/2026:09:33:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1" Feb 20 04:33:17 localhost podman[241968]: @ - - [20/Feb/2026:09:33:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" "Go-http-client/1.1" Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.842 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df 
--format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.851 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:33:17 localhost python3.9[266247]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.866 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.868 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:33:17 localhost nova_compute[230552]: 2026-02-20 09:33:17.869 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf 
- - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.202 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.207 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 93 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd5be686c-ccca-4abf-802e-52fead763937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 93, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.204273', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '305952f2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'c984a7f767a135ff181801d95ea22426530434aa279863fbef10421406d5064b'}]}, 'timestamp': '2026-02-20 09:33:18.208851', '_unique_id': 'b233ebc65907420cb30ad19525dcce4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fea40532-8df8-40a3-90a7-9f1cc54e7211', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.211691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '305f90fe-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '182272a1572c9d36e179103ff6a542b3fa86b75d1a6eaa2c3a38de6036e73e7c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.211691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '305fa670-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': 'e4274fd0e753dee990e005c22c96c244766eb9719a36e8fb80aebc263f6dd8db'}]}, 'timestamp': '2026-02-20 09:33:18.250166', '_unique_id': 'd3201ef5809842e5b076f11fe4385805'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50cd560c-3c5d-4453-9a95-4196af5eebcf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.253030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306028b6-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': 'eab1c20a5e6df18bbe7a4c1ddb3f018505bdaeefecb140d385f26a7d968371d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.253030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '30603a9a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '0bda803e95b23cb5192c856143e8f03ae39f0076d8b3613f387919dc268be7ed'}]}, 'timestamp': '2026-02-20 09:33:18.253947', '_unique_id': 'd2a5feab5ba44e83922652a75b1800c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.256 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72b8f962-c07a-4c18-b6ef-1068480f2cd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.256217', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '3060a520-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': '540bf68fb6ce6970cd2c452d08bdca89109ca10936f9d168194e6e70361ef0fe'}]}, 'timestamp': '2026-02-20 09:33:18.256743', '_unique_id': '703d54ef39e14937baa8047a0965c455'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2f825e1-6d0e-4030-8e63-2da35cb1cb6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:33:18.258927', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '30641e94-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.518038794, 'message_signature': '58be7d3e722b4d98a3a486ca94a2f3c8c23c56a2171d42c36975b5d36996d743'}]}, 'timestamp': '2026-02-20 09:33:18.279462', '_unique_id': 'b3f417aeeabf406e9010a3d3b479e3b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 04:33:18
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd264440a-0741-465b-b055-82f47fd397da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.281707', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '30648884-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 
'message_signature': 'de1e408e8fbafbee57093afa7d3dce5af375d58d40c39c4604e6b642d74c6dcb'}]}, 'timestamp': '2026-02-20 09:33:18.282184', '_unique_id': '946dbee8e6384ba290cb0a7e3bfd5595'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7baa03fa-6fde-456d-a84c-24789f32d7fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.284405', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '3064f2d8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': '1a273062535fa3d1edf4a283b3028ee56710221636cc44756030902a5f69c68e'}]}, 'timestamp': '2026-02-20 09:33:18.284909', '_unique_id': '40f981c3aecc4e259d0c8209e87128d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.286 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 61130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '946d53bb-463c-4f44-b2c6-edd8a6de83a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 61130000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:33:18.287052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '306558b8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.518038794, 'message_signature': '5f330bb91c7ef305285fb1cdfe805420c308b4f3837bd0c24f443ab0dd5bf3e3'}]}, 'timestamp': '2026-02-20 09:33:18.287496', '_unique_id': '260b382542b84f1da57adf0dc37dff57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.289 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93459998-0cb0-4ed8-8cca-d3fb5e0805a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.290129', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '3065d1b2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'a8af50ad4be9ca208485529fea8254215654d5c333a97e9ddcc515450fc23192'}]}, 'timestamp': '2026-02-20 09:33:18.290609', '_unique_id': '93433b16cd6f4be0934a1b401af68a7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.292 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cb83d1e-c5c1-43c0-9ca6-f897377e97a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.292790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306638f0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': 'd7d6a6b4506a23d7cb3820b4d818a83b3bb0b5df26538b8c5d360cd900a43a68'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.292790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306649b2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '9e677835955acf553570d54394cfc17ce1cd577cdc75acfe3e15f3405cdd7e1e'}]}, 'timestamp': '2026-02-20 09:33:18.293678', '_unique_id': 'c56bad40d7894feeb6426da77475f8e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.295 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '94509075-082f-4de0-b8b9-eebcd1f55eac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.295975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3066b578-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '767cc8ce06aa9e8150c94f99f814d074fbaad38a7ba8b30973466568c3225c89'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.295975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3066c5f4-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '4f9da4db6dd42a012d3276ef51311edea6dae254d604089c153ceb4eaec784b1'}]}, 'timestamp': '2026-02-20 09:33:18.296860', '_unique_id': '4bf1b26eb3df465ba4af31a1d5fcd91b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b73d3d59-25de-4e78-a246-e701bccbfc11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.299012', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '30672c38-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'bbf995a2d3adadbaba20923a961ba26bb3c55e2ab7764c97116e116503dc1d90'}]}, 'timestamp': '2026-02-20 09:33:18.299479', '_unique_id': '6e16ba6fbc42499ea8c07d86c4da093b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ff73fba-3ca0-4de8-a06e-b54190abc7bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.301838', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3069e8e2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': '1e4951343a73009f8e713a7777fbd031e201d22d2297f7a6a50283873781676e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.301838', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3069fc1a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': '5c5ec7b5ac4b07e7abaa35d9357d77edbe79e5e9f9ae108e9f55a058c52ce0a3'}]}, 'timestamp': '2026-02-20 09:33:18.317895', '_unique_id': 'e95298657fe34a51902bdac39799e829'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4c599811-144e-4842-badb-9df62c63433a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.320215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306a689e-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': '35f94e474f2504c3415de75f47f1a1889dcf0eedcbb14297fea8ab1ac007adc3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.320215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306a7abe-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': 'e284ae35b6fe3de4098a39353b19dea92fc177ab5b380ab6a065ba3cd67ed866'}]}, 'timestamp': '2026-02-20 09:33:18.321128', '_unique_id': '0b509875283a438da1a63079193ed2b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:33:18.322 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:33:18.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.323 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:33:18.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39a59869-206d-4ab7-b179-db2a8a802850', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.323347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306ae2ce-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '755283869f54fb5a096f8405c593ab53744fa1cdb9a523eca318f0404532ce35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.323347', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306af6c4-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '9ac27200c0a449951e20650aaa25fe76719d76f2615c747086c8e06cfb16fdbb'}]}, 'timestamp': '2026-02-20 09:33:18.324301', '_unique_id': 'c1d1c7f7eca3469c9fa17765dca573f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6b654ba-21ae-416f-9941-3110d8bbfb2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.326670', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '306b6596-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'c7ff80cb2ef4c0beca3687363bcb8f6c4fa15d4f8db2116bb0361e32b143416e'}]}, 'timestamp': '2026-02-20 09:33:18.327162', '_unique_id': 'f5582a73788045f4aa859464352b9c8c'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '203b8679-49e7-4432-be39-8cf2bc7e6346', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.329339', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '306bcce8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': '467bb59919beb077614727815768d1d917ecfd6e2a60de7c1301d1b56699c377'}]}, 'timestamp': '2026-02-20 09:33:18.329844', '_unique_id': '0f390656fb4b41bd8feab2407ff4fa70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging return 
fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b4169477-f5e1-4abd-8ab7-51045d60d99a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.332036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306c36d8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': '1279598e86ebedd13f794bc43a63a36b99e56f71d9ffbcb5f50a9e20ded83549'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.332036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306c48c6-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': 'a56d213f62dd778c114b189d76dc082e4e4c0b11bd66213ebee9b430934cb073'}]}, 'timestamp': '2026-02-20 09:33:18.332953', '_unique_id': '42d223afb20243c290e14576bd8107f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:33:18.333 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:33:18.333 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.335 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '17ba3e1d-9732-495d-8461-7031c2abc42e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.335157', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '306cb068-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': '5abfbe06d062a269960ba5ab481c09c75e5e889ac31c531208bcbfc15505fac1'}]}, 'timestamp': '2026-02-20 09:33:18.335664', '_unique_id': '160311d14912418da848f8428fa015ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.337 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.338 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a7b48f7-3cda-4516-9bb1-5d3c31dcfd20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.337828', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306d185a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '432b10e0016047247f95394e36fd3c6e4baba78bdd2a7807e8fbbbf0b1187f13'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.337828', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306d293a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': 'e8d805505b9bc88577a245421d7fe8b7b6d93f1771a5524f1fdf71acbda7c9b1'}]}, 'timestamp': '2026-02-20 09:33:18.338736', '_unique_id': '440f4b798b9545a694ef3ebbaf875bf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 9437 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81204abe-78c0-43f6-993a-c4b24b9fae7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9437, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.340958', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '306d92f8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'cf4a60eebe80ac8d8bff8da9c914ad9e72564810dd1b5385780bf23b6a623996'}]}, 'timestamp': '2026-02-20 09:33:18.341436', '_unique_id': 'f3250cd36c994224b01220f36e2862de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:33:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:33:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging Feb 20 04:33:18 localhost nova_compute[230552]: 2026-02-20 09:33:18.869 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:18 localhost nova_compute[230552]: 2026-02-20 09:33:18.870 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:18 localhost nova_compute[230552]: 2026-02-20 09:33:18.870 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:19 localhost python3.9[266359]: ansible-ansible.builtin.service_facts Invoked Feb 20 04:33:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:33:19 localhost network[266382]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 04:33:19 localhost network[266383]: 'network-scripts' will be removed from distribution in near future. Feb 20 04:33:19 localhost network[266384]: It is advised to switch to 'NetworkManager' instead for network management. Feb 20 04:33:19 localhost podman[266368]: 2026-02-20 09:33:19.157940157 +0000 UTC m=+0.086471248 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:33:19 localhost podman[266368]: 2026-02-20 09:33:19.168982088 +0000 UTC m=+0.097513149 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:33:19 localhost nova_compute[230552]: 2026-02-20 09:33:19.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:19 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:33:20 localhost nova_compute[230552]: 2026-02-20 09:33:20.295 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:20 localhost nova_compute[230552]: 2026-02-20 09:33:20.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:33:20 localhost nova_compute[230552]: 2026-02-20 09:33:20.940 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:20 localhost nova_compute[230552]: 2026-02-20 09:33:20.942 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:20 localhost nova_compute[230552]: 2026-02-20 09:33:20.942 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:20 localhost nova_compute[230552]: 2026-02-20 09:33:20.942 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:20 localhost nova_compute[230552]: 2026-02-20 09:33:20.985 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:20 localhost nova_compute[230552]: 2026-02-20 09:33:20.986 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:22 localhost nova_compute[230552]: 2026-02-20 09:33:22.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:22 localhost nova_compute[230552]: 2026-02-20 09:33:22.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:33:23 localhost nova_compute[230552]: 2026-02-20 09:33:23.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:33:23 localhost nova_compute[230552]: 2026-02-20 09:33:23.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:33:23 localhost nova_compute[230552]: 2026-02-20 09:33:23.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:33:23 localhost nova_compute[230552]: 2026-02-20 09:33:23.374 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:33:23 localhost nova_compute[230552]: 2026-02-20 09:33:23.374 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:33:23 localhost nova_compute[230552]: 2026-02-20 09:33:23.374 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:33:23 localhost nova_compute[230552]: 2026-02-20 09:33:23.375 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:33:23 localhost python3.9[266632]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:33:24 localhost nova_compute[230552]: 2026-02-20 09:33:24.072 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:33:24 localhost nova_compute[230552]: 2026-02-20 09:33:24.092 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:33:24 localhost nova_compute[230552]: 2026-02-20 09:33:24.092 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:33:25 localhost nova_compute[230552]: 2026-02-20 09:33:25.988 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:25 localhost nova_compute[230552]: 2026-02-20 09:33:25.990 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:25 localhost nova_compute[230552]: 2026-02-20 09:33:25.991 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:25 localhost nova_compute[230552]: 2026-02-20 09:33:25.991 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:25 localhost nova_compute[230552]: 2026-02-20 09:33:25.992 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:25 localhost nova_compute[230552]: 2026-02-20 09:33:25.993 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:26 localhost openstack_network_exporter[244414]: ERROR 09:33:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:33:26 localhost openstack_network_exporter[244414]: Feb 20 04:33:26 localhost openstack_network_exporter[244414]: ERROR 09:33:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:33:26 localhost openstack_network_exporter[244414]: Feb 20 04:33:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:33:28 localhost systemd[1]: tmp-crun.c7tFEi.mount: Deactivated successfully. 
Feb 20 04:33:28 localhost podman[266706]: 2026-02-20 09:33:28.166557879 +0000 UTC m=+0.099867333 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:33:28 localhost podman[266706]: 2026-02-20 09:33:28.181012456 +0000 UTC m=+0.114321940 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:33:28 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:33:28 localhost python3.9[266766]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 20 04:33:29 localhost python3.9[266876]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Feb 20 04:33:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:33:29 localhost podman[266987]: 2026-02-20 09:33:29.955105465 +0000 UTC m=+0.079126620 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 
20 04:33:29 localhost podman[266987]: 2026-02-20 09:33:29.972140162 +0000 UTC m=+0.096161357 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, name=ubi9/ubi-minimal, release=1770267347, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:33:29 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:33:30 localhost python3.9[266986]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:33:30 localhost sshd[267062]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:33:30 localhost python3.9[267061]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18810 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F66950000000001030307) Feb 20 04:33:30 localhost nova_compute[230552]: 2026-02-20 09:33:30.994 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:30 localhost nova_compute[230552]: 2026-02-20 09:33:30.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:30 localhost nova_compute[230552]: 2026-02-20 09:33:30.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:30 localhost nova_compute[230552]: 2026-02-20 09:33:30.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering 
IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:31 localhost nova_compute[230552]: 2026-02-20 09:33:31.021 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:31 localhost nova_compute[230552]: 2026-02-20 09:33:31.022 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:33:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:33:31 localhost podman[267175]: 2026-02-20 09:33:31.315709964 +0000 UTC m=+0.100695768 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent) Feb 20 04:33:31 localhost podman[267173]: 2026-02-20 09:33:31.342203214 +0000 UTC m=+0.127057544 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:33:31 localhost podman[267175]: 2026-02-20 09:33:31.353170643 +0000 UTC m=+0.138156457 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:33:31 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:33:31 localhost python3.9[267174]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:31 localhost podman[267173]: 2026-02-20 09:33:31.413110999 +0000 UTC m=+0.197965279 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller) Feb 20 04:33:31 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:33:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18811 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F6AA80000000001030307) Feb 20 04:33:32 localhost python3.9[267327]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:33:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58676 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F6D690000000001030307) Feb 20 04:33:32 localhost python3.9[267438]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 
04:33:33 localhost python3.9[267549]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:33:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18812 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F72A90000000001030307) Feb 20 04:33:34 localhost python3.9[267661]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:33:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8313 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F75690000000001030307) Feb 20 04:33:35 localhost python3.9[267772]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:36 localhost python3.9[267882]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 20 04:33:36 localhost nova_compute[230552]: 2026-02-20 09:33:36.023 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:36 localhost nova_compute[230552]: 2026-02-20 09:33:36.024 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:36 localhost nova_compute[230552]: 2026-02-20 09:33:36.025 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:36 localhost nova_compute[230552]: 2026-02-20 09:33:36.025 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:36 localhost nova_compute[230552]: 2026-02-20 09:33:36.056 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:36 localhost nova_compute[230552]: 2026-02-20 09:33:36.057 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:36 localhost python3.9[267992]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:37 localhost python3.9[268102]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf 
regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18813 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F82680000000001030307) Feb 20 04:33:37 localhost python3.9[268212]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:38 localhost python3.9[268322]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:33:39 localhost python3.9[268434]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:33:40 localhost python3.9[268546]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:33:41 localhost nova_compute[230552]: 2026-02-20 09:33:41.057 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:42 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:33:42 localhost systemd[1]: tmp-crun.dKL4Yr.mount: Deactivated successfully. Feb 20 04:33:42 localhost podman[268612]: 2026-02-20 09:33:42.152627103 +0000 UTC m=+0.093856626 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:33:42 localhost podman[268612]: 2026-02-20 09:33:42.166983298 +0000 UTC m=+0.108212831 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute) Feb 20 04:33:42 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:33:42 localhost python3.9[268677]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 20 04:33:43 localhost python3.9[268787]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Feb 20 04:33:43 localhost python3.9[268897]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:33:44 localhost python3.9[268954]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:45 localhost python3.9[269064]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules 
encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:46 localhost nova_compute[230552]: 2026-02-20 09:33:46.059 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:46 localhost nova_compute[230552]: 2026-02-20 09:33:46.062 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:46 localhost nova_compute[230552]: 2026-02-20 09:33:46.062 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:46 localhost nova_compute[230552]: 2026-02-20 09:33:46.062 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:46 localhost nova_compute[230552]: 2026-02-20 09:33:46.106 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:46 localhost nova_compute[230552]: 2026-02-20 09:33:46.107 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18814 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FA3680000000001030307) Feb 20 04:33:46 localhost python3.9[269174]: ansible-ansible.legacy.dnf Invoked with 
name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 20 04:33:47 localhost podman[241968]: time="2026-02-20T09:33:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:33:47 localhost podman[241968]: @ - - [20/Feb/2026:09:33:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1" Feb 20 04:33:47 localhost podman[241968]: @ - - [20/Feb/2026:09:33:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16794 "" "Go-http-client/1.1" Feb 20 04:33:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:33:50 localhost systemd[1]: tmp-crun.GELx2G.mount: Deactivated successfully. 
Feb 20 04:33:50 localhost podman[269194]: 2026-02-20 09:33:50.154732127 +0000 UTC m=+0.091432111 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:33:50 localhost podman[269194]: 2026-02-20 09:33:50.188761351 +0000 UTC m=+0.125461305 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:33:50 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:33:50 localhost python3.9[269308]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 20 04:33:51 localhost nova_compute[230552]: 2026-02-20 09:33:51.108 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:51 localhost nova_compute[230552]: 2026-02-20 09:33:51.110 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:51 localhost nova_compute[230552]: 2026-02-20 09:33:51.110 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:51 localhost nova_compute[230552]: 2026-02-20 09:33:51.110 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:51 localhost nova_compute[230552]: 2026-02-20 09:33:51.150 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:51 localhost nova_compute[230552]: 2026-02-20 09:33:51.151 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:51 localhost python3.9[269422]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:33:52 localhost python3.9[269532]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:33:52 localhost systemd[1]: Reloading. Feb 20 04:33:52 localhost systemd-rc-local-generator[269558]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:33:52 localhost systemd-sysv-generator[269564]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:33:53 localhost python3.9[269676]: ansible-ansible.builtin.service_facts Invoked Feb 20 04:33:54 localhost network[269693]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 20 04:33:54 localhost network[269694]: 'network-scripts' will be removed from distribution in near future. Feb 20 04:33:54 localhost network[269695]: It is advised to switch to 'NetworkManager' instead for network management. Feb 20 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:33:56 localhost nova_compute[230552]: 2026-02-20 09:33:56.153 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:56 localhost nova_compute[230552]: 2026-02-20 09:33:56.155 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:33:56 localhost nova_compute[230552]: 2026-02-20 09:33:56.155 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:33:56 localhost nova_compute[230552]: 2026-02-20 09:33:56.156 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:56 localhost nova_compute[230552]: 2026-02-20 09:33:56.185 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:33:56 localhost nova_compute[230552]: 2026-02-20 09:33:56.186 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:33:56 localhost openstack_network_exporter[244414]: ERROR 09:33:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:33:56 localhost openstack_network_exporter[244414]: Feb 20 04:33:56 localhost openstack_network_exporter[244414]: ERROR 09:33:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:33:56 localhost openstack_network_exporter[244414]: Feb 20 04:33:58 localhost sshd[269893]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:33:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:33:58 localhost systemd[1]: tmp-crun.AaLob2.mount: Deactivated successfully. Feb 20 04:33:58 localhost podman[269929]: 2026-02-20 09:33:58.795234014 +0000 UTC m=+0.096814068 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:33:58 localhost podman[269929]: 2026-02-20 09:33:58.812918831 +0000 UTC m=+0.114498845 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:33:58 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:33:59 localhost python3.9[269930]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:33:59 localhost python3.9[270063]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:34:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:34:00 localhost systemd[1]: tmp-crun.BG06HK.mount: Deactivated successfully. Feb 20 04:34:00 localhost podman[270174]: 2026-02-20 09:34:00.158720433 +0000 UTC m=+0.091310038 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 
'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, version=9.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z) Feb 20 04:34:00 localhost podman[270174]: 2026-02-20 09:34:00.175945746 +0000 UTC m=+0.108535361 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git) Feb 20 04:34:00 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:34:00 localhost python3.9[270180]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:34:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59202 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FDBC40000000001030307) Feb 20 04:34:01 localhost python3.9[270307]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:34:01 localhost nova_compute[230552]: 2026-02-20 09:34:01.187 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:01 localhost nova_compute[230552]: 2026-02-20 09:34:01.189 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:01 localhost nova_compute[230552]: 2026-02-20 09:34:01.189 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:34:01 localhost nova_compute[230552]: 2026-02-20 09:34:01.189 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:01 localhost nova_compute[230552]: 2026-02-20 09:34:01.223 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:01 localhost nova_compute[230552]: 2026-02-20 
09:34:01.224 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:34:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:34:01 localhost podman[270420]: 2026-02-20 09:34:01.562970033 +0000 UTC m=+0.080608997 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:34:01 localhost podman[270420]: 2026-02-20 09:34:01.568973838 +0000 UTC m=+0.086612782 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Feb 20 04:34:01 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:34:01 localhost podman[270419]: 2026-02-20 09:34:01.613482886 +0000 UTC m=+0.132932696 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 20 04:34:01 localhost podman[270419]: 2026-02-20 09:34:01.654079063 +0000 UTC m=+0.173528833 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:34:01 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:34:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59203 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FDFE80000000001030307) Feb 20 04:34:01 localhost python3.9[270418]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:34:02 localhost python3.9[270570]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:34:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18815 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FE3680000000001030307) Feb 20 04:34:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59204 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FE7E80000000001030307) Feb 20 04:34:04 localhost python3.9[270681]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:34:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58677 DF PROTO=TCP SPT=42326 
DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FEB690000000001030307) Feb 20 04:34:04 localhost python3.9[270792]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:34:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:34:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:34:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:34:05.998 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:34:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:34:06.000 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:34:06 localhost nova_compute[230552]: 2026-02-20 09:34:06.224 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:06 localhost nova_compute[230552]: 2026-02-20 09:34:06.226 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:06 localhost nova_compute[230552]: 2026-02-20 09:34:06.227 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:34:06 localhost nova_compute[230552]: 2026-02-20 09:34:06.227 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:06 localhost nova_compute[230552]: 2026-02-20 09:34:06.256 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:06 localhost nova_compute[230552]: 2026-02-20 09:34:06.256 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:07 localhost python3.9[270903]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59205 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FF7A80000000001030307) Feb 20 04:34:08 localhost python3.9[271013]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 20 04:34:08 localhost python3.9[271123]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:09 localhost python3.9[271233]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:10 localhost python3.9[271343]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:10 localhost python3.9[271453]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:11 localhost nova_compute[230552]: 2026-02-20 09:34:11.257 
230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:11 localhost nova_compute[230552]: 2026-02-20 09:34:11.259 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:11 localhost nova_compute[230552]: 2026-02-20 09:34:11.259 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:34:11 localhost nova_compute[230552]: 2026-02-20 09:34:11.259 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:11 localhost nova_compute[230552]: 2026-02-20 09:34:11.260 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:11 localhost nova_compute[230552]: 2026-02-20 09:34:11.263 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:11 localhost python3.9[271563]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:12 localhost python3.9[271673]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:34:12 localhost systemd[1]: tmp-crun.u7w7SS.mount: Deactivated successfully. Feb 20 04:34:12 localhost podman[271784]: 2026-02-20 09:34:12.633227387 +0000 UTC m=+0.095372874 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute) Feb 20 04:34:12 localhost podman[271784]: 2026-02-20 09:34:12.643930387 +0000 UTC m=+0.106075894 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:34:12 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:34:12 localhost python3.9[271783]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:13 localhost python3.9[271912]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:14 localhost python3.9[272022]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:14 localhost nova_compute[230552]: 2026-02-20 09:34:14.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:14 localhost nova_compute[230552]: 2026-02-20 09:34:14.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 20 04:34:14 localhost python3.9[272132]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:15 localhost python3.9[272242]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59206 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 
OPT (020405500402080A59A017680000000001030307) Feb 20 04:34:16 localhost nova_compute[230552]: 2026-02-20 09:34:16.296 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:16 localhost nova_compute[230552]: 2026-02-20 09:34:16.298 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:16 localhost nova_compute[230552]: 2026-02-20 09:34:16.299 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:34:16 localhost nova_compute[230552]: 2026-02-20 09:34:16.299 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:16 localhost nova_compute[230552]: 2026-02-20 09:34:16.300 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:16 localhost nova_compute[230552]: 2026-02-20 09:34:16.301 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:16 localhost python3.9[272352]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:17 localhost python3.9[272462]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service 
state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:17 localhost nova_compute[230552]: 2026-02-20 09:34:17.327 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:17 localhost podman[241968]: time="2026-02-20T09:34:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:34:17 localhost python3.9[272608]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:34:17 localhost podman[241968]: @ - - [20/Feb/2026:09:34:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1" Feb 20 04:34:17 localhost podman[241968]: @ - - [20/Feb/2026:09:34:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16797 "" "Go-http-client/1.1" Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.320 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.321 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.321 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.321 230556 DEBUG nova.compute.resource_tracker [None 
req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.322 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.786 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.844 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:34:18 localhost nova_compute[230552]: 2026-02-20 09:34:18.845 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.062 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.064 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12183MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.064 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.065 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.256 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.258 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.258 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.332 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 04:34:19 localhost python3.9[272789]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.399 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using 
data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.400 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.415 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.435 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: 
HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.470 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.923 230556 DEBUG oslo_concurrency.processutils [None 
req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.930 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.946 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.949 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.949 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.884s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:34:19 localhost nova_compute[230552]: 2026-02-20 09:34:19.950 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:20 localhost python3.9[272921]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 20 04:34:20 localhost nova_compute[230552]: 2026-02-20 09:34:20.957 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:20 localhost nova_compute[230552]: 2026-02-20 09:34:20.976 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:34:21 localhost podman[273032]: 2026-02-20 09:34:21.092577075 +0000 UTC m=+0.087924853 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:34:21 localhost podman[273032]: 2026-02-20 09:34:21.105540717 +0000 UTC m=+0.100888485 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:34:21 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:34:21 localhost nova_compute[230552]: 2026-02-20 09:34:21.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:21 localhost nova_compute[230552]: 2026-02-20 09:34:21.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:21 localhost nova_compute[230552]: 2026-02-20 09:34:21.302 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:21 localhost nova_compute[230552]: 2026-02-20 09:34:21.303 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:21 localhost nova_compute[230552]: 2026-02-20 09:34:21.303 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:34:21 localhost nova_compute[230552]: 2026-02-20 09:34:21.304 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:21 localhost nova_compute[230552]: 2026-02-20 09:34:21.305 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:21 localhost nova_compute[230552]: 2026-02-20 09:34:21.307 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:21 localhost python3.9[273031]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 20 04:34:21 localhost systemd[1]: Reloading. Feb 20 04:34:21 localhost systemd-rc-local-generator[273077]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:34:21 localhost systemd-sysv-generator[273085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:34:22 localhost nova_compute[230552]: 2026-02-20 09:34:22.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:22 localhost nova_compute[230552]: 2026-02-20 09:34:22.302 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:34:22 localhost nova_compute[230552]: 2026-02-20 09:34:22.302 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:22 localhost nova_compute[230552]: 2026-02-20 09:34:22.303 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 20 04:34:22 localhost nova_compute[230552]: 2026-02-20 09:34:22.315 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 20 04:34:22 localhost python3.9[273201]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:34:22 localhost python3.9[273312]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:34:23 localhost python3.9[273423]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None 
stdin=None Feb 20 04:34:24 localhost python3.9[273534]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:34:24 localhost nova_compute[230552]: 2026-02-20 09:34:24.315 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:34:24 localhost nova_compute[230552]: 2026-02-20 09:34:24.316 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:34:24 localhost nova_compute[230552]: 2026-02-20 09:34:24.316 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:34:24 localhost nova_compute[230552]: 2026-02-20 09:34:24.741 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:34:24 localhost nova_compute[230552]: 2026-02-20 09:34:24.742 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:34:24 localhost nova_compute[230552]: 2026-02-20 09:34:24.742 230556 DEBUG 
nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:34:24 localhost nova_compute[230552]: 2026-02-20 09:34:24.742 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:34:24 localhost python3.9[273645]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:34:25 localhost nova_compute[230552]: 2026-02-20 09:34:25.099 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", 
"datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:34:25 localhost nova_compute[230552]: 2026-02-20 09:34:25.116 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:34:25 localhost nova_compute[230552]: 2026-02-20 09:34:25.116 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:34:25 localhost python3.9[273756]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:34:26 localhost nova_compute[230552]: 2026-02-20 09:34:26.307 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:26 localhost openstack_network_exporter[244414]: ERROR 09:34:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:34:26 localhost openstack_network_exporter[244414]: Feb 20 04:34:26 localhost openstack_network_exporter[244414]: ERROR 09:34:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an 
existing datapath Feb 20 04:34:26 localhost openstack_network_exporter[244414]: Feb 20 04:34:26 localhost python3.9[273867]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:34:27 localhost python3.9[273978]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:34:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:34:29 localhost podman[273997]: 2026-02-20 09:34:29.137828095 +0000 UTC m=+0.078329756 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:34:29 localhost podman[273997]: 2026-02-20 09:34:29.155202462 +0000 UTC m=+0.095704143 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The 
Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:34:29 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:34:30 localhost sshd[274019]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:34:30 localhost podman[274021]: 2026-02-20 09:34:30.372763944 +0000 UTC m=+0.079503742 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 
'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Feb 20 04:34:30 localhost podman[274021]: 2026-02-20 09:34:30.413557327 +0000 UTC m=+0.120297085 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container) Feb 20 04:34:30 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:34:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32280 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A050F40000000001030307) Feb 20 04:34:31 localhost python3.9[274133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 20 04:34:31 localhost nova_compute[230552]: 2026-02-20 09:34:31.310 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:34:31 localhost nova_compute[230552]: 2026-02-20 09:34:31.311 
230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:31 localhost nova_compute[230552]: 2026-02-20 09:34:31.312 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:34:31 localhost nova_compute[230552]: 2026-02-20 09:34:31.312 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:31 localhost nova_compute[230552]: 2026-02-20 09:34:31.312 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:34:31 localhost nova_compute[230552]: 2026-02-20 09:34:31.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:34:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32281 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A054E80000000001030307) Feb 20 04:34:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:34:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:34:31 localhost podman[274245]: 2026-02-20 09:34:31.785593119 +0000 UTC m=+0.077356395 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 04:34:31 localhost podman[274245]: 2026-02-20 09:34:31.792418831 +0000 UTC m=+0.084182127 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 04:34:31 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:34:31 localhost podman[274244]: 2026-02-20 09:34:31.840725136 +0000 UTC m=+0.132361568 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 20 04:34:31 localhost podman[274244]: 2026-02-20 09:34:31.901221849 +0000 UTC m=+0.192858241 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 04:34:31 localhost python3.9[274243]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:31 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:34:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59207 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A057680000000001030307)
Feb 20 04:34:32 localhost python3.9[274395]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:33 localhost python3.9[274505]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32282 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A05CE80000000001030307)
Feb 20 04:34:34 localhost python3.9[274615]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:34 localhost python3.9[274725]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18816 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A061680000000001030307)
Feb 20 04:34:35 localhost python3.9[274835]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:35 localhost python3.9[274945]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:36 localhost nova_compute[230552]: 2026-02-20 09:34:36.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:34:36 localhost python3.9[275055]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32283 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A06CA80000000001030307)
Feb 20 04:34:38 localhost sshd[275073]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:34:41 localhost nova_compute[230552]: 2026-02-20 09:34:41.316 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:34:41 localhost nova_compute[230552]: 2026-02-20 09:34:41.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:34:41 localhost nova_compute[230552]: 2026-02-20 09:34:41.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:34:41 localhost nova_compute[230552]: 2026-02-20 09:34:41.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:34:41 localhost nova_compute[230552]: 2026-02-20 09:34:41.319 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:34:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:34:43 localhost podman[275075]: 2026-02-20 09:34:43.364210409 +0000 UTC m=+0.301516854 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Feb 20 04:34:43 localhost podman[275075]: 2026-02-20 09:34:43.377176251 +0000 UTC m=+0.314482656 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 04:34:43 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 04:34:43 localhost python3.9[275186]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 20 04:34:44 localhost sshd[275205]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:34:45 localhost systemd-logind[759]: New session 61 of user zuul.
Feb 20 04:34:45 localhost systemd[1]: Started Session 61 of User zuul.
Feb 20 04:34:45 localhost systemd[1]: session-61.scope: Deactivated successfully.
Feb 20 04:34:45 localhost systemd-logind[759]: Session 61 logged out. Waiting for processes to exit.
Feb 20 04:34:45 localhost systemd-logind[759]: Removed session 61.
Feb 20 04:34:45 localhost python3.9[275316]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:34:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32284 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A08D690000000001030307)
Feb 20 04:34:46 localhost python3.9[275371]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:46 localhost nova_compute[230552]: 2026-02-20 09:34:46.320 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:34:46 localhost python3.9[275479]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:34:47 localhost python3.9[275565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771580086.314077-2358-206837844614125/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:47 localhost podman[241968]: time="2026-02-20T09:34:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 04:34:47 localhost podman[241968]: @ - - [20/Feb/2026:09:34:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 20 04:34:47 localhost podman[241968]: @ - - [20/Feb/2026:09:34:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16786 "" "Go-http-client/1.1"
Feb 20 04:34:47 localhost python3.9[275673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:34:48 localhost python3.9[275759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771580087.419933-2358-125558617568500/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:48 localhost python3.9[275867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:34:49 localhost python3.9[275953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771580088.4403605-2358-212203156006371/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:50 localhost python3.9[276061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 04:34:51 localhost python3.9[276147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771580090.1671562-2520-205150396176672/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=50598ea057afd85a1f5b995974d61e2c257c9737 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:51 localhost nova_compute[230552]: 2026-02-20 09:34:51.322 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:34:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 04:34:52 localhost systemd[1]: tmp-crun.eoincw.mount: Deactivated successfully.
Feb 20 04:34:52 localhost podman[276257]: 2026-02-20 09:34:52.203951865 +0000 UTC m=+0.138520430 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 04:34:52 localhost podman[276257]: 2026-02-20 09:34:52.213941883 +0000 UTC m=+0.148510458 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 20 04:34:52 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 04:34:52 localhost python3.9[276263]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:34:52 localhost python3.9[276388]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:34:53 localhost python3.9[276498]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:34:54 localhost python3.9[276610]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:34:55 localhost python3.9[276718]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 04:34:56 localhost python3.9[276830]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:34:56 localhost nova_compute[230552]: 2026-02-20 09:34:56.327 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:34:56 localhost nova_compute[230552]: 2026-02-20 09:34:56.329 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:34:56 localhost nova_compute[230552]: 2026-02-20 09:34:56.329 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:34:56 localhost nova_compute[230552]: 2026-02-20 09:34:56.329 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:34:56 localhost nova_compute[230552]: 2026-02-20 09:34:56.370 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:34:56 localhost nova_compute[230552]: 2026-02-20 09:34:56.371 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:34:56 localhost openstack_network_exporter[244414]: ERROR 09:34:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 04:34:56 localhost openstack_network_exporter[244414]:
Feb 20 04:34:56 localhost openstack_network_exporter[244414]: ERROR 09:34:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 04:34:56 localhost openstack_network_exporter[244414]:
Feb 20 04:34:56 localhost python3.9[276940]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 04:34:57 localhost python3.9[277048]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:35:00 localhost systemd[1]: tmp-crun.FZtexQ.mount: Deactivated successfully.
Feb 20 04:35:00 localhost podman[277355]: 2026-02-20 09:35:00.010063741 +0000 UTC m=+0.100336347 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 04:35:00 localhost podman[277355]: 2026-02-20 09:35:00.017703287 +0000 UTC m=+0.107975883 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 20 04:35:00 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 04:35:00 localhost python3.9[277354]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 20 04:35:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14677 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0C6260000000001030307)
Feb 20 04:35:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 04:35:00 localhost podman[277433]: 2026-02-20 09:35:00.874222713 +0000 UTC m=+0.087452148 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, release=1770267347, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:35:00 localhost podman[277433]: 2026-02-20 09:35:00.889417883 +0000 UTC m=+0.102647298 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Feb 20 04:35:00 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:35:01 localhost python3.9[277506]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.306 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.324 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.325 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.325 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.368 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.371 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.374 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.374 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.374 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.394 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:01 localhost nova_compute[230552]: 2026-02-20 09:35:01.395 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14678 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0CA280000000001030307) Feb 20 04:35:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:35:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:35:02 localhost systemd[1]: tmp-crun.BFztpR.mount: Deactivated successfully. Feb 20 04:35:02 localhost podman[277525]: 2026-02-20 09:35:02.155742174 +0000 UTC m=+0.084337782 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:35:02 localhost podman[277525]: 2026-02-20 09:35:02.164084612 +0000 UTC m=+0.092680250 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS) Feb 20 04:35:02 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:35:02 localhost podman[277524]: 2026-02-20 09:35:02.261765956 +0000 UTC m=+0.193803800 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:35:02 localhost podman[277524]: 2026-02-20 09:35:02.302219759 +0000 
UTC m=+0.234257583 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:35:02 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:35:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32285 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0CD680000000001030307) Feb 20 04:35:03 localhost python3[277657]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:35:03 localhost python3[277657]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",#012 "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-01-30T06:31:38.534497001Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 
"tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1214548351,#012 "VirtualSize": 1214548351,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",#012 "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",#012 "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": 
[#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del 
/etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Feb 20 04:35:03 localhost podman[277710]: 2026-02-20 09:35:03.441358191 +0000 UTC m=+0.081374900 container remove 8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=nova_compute_init, container_name=nova_compute_init) Feb 20 04:35:03 localhost python3[277657]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute_init Feb 20 04:35:03 localhost podman[277723]: Feb 20 04:35:03 localhost podman[277723]: 2026-02-20 09:35:03.555918558 +0000 UTC m=+0.093047971 container create d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 04:35:03 localhost podman[277723]: 2026-02-20 09:35:03.512566606 +0000 UTC m=+0.049696049 image pull 
quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 20 04:35:03 localhost python3[277657]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Feb 20 04:35:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14679 DF 
PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0D2280000000001030307) Feb 20 04:35:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59208 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0D5680000000001030307) Feb 20 04:35:04 localhost python3.9[277870]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:35:05 localhost python3.9[277980]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 20 04:35:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:35:05.997 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:35:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:35:05.998 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:35:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:35:06.000 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:35:06 localhost nova_compute[230552]: 2026-02-20 09:35:06.395 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:35:06 localhost nova_compute[230552]: 2026-02-20 09:35:06.397 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:35:06 localhost nova_compute[230552]: 2026-02-20 09:35:06.397 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:35:06 localhost nova_compute[230552]: 2026-02-20 09:35:06.398 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:06 localhost nova_compute[230552]: 2026-02-20 09:35:06.437 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:06 localhost nova_compute[230552]: 2026-02-20 09:35:06.438 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:07 localhost python3.9[278090]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:35:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14680 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0E1E80000000001030307) Feb 20 04:35:07 localhost python3.9[278180]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771580106.6821008-2991-231221965954345/.source.yaml _original_basename=.n49b82ef follow=False checksum=4d557a266f0e30e386f17a3d7c6078d564f9be8b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:35:08 localhost python3.9[278290]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:35:09 localhost python3.9[278400]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 20 04:35:10 localhost sshd[278434]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:35:10 localhost python3.9[278512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:35:11 localhost python3.9[278569]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/nova_compute.json _original_basename=.co0ftjja recurse=False state=file path=/var/lib/kolla/config_files/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:35:11 localhost nova_compute[230552]: 2026-02-20 09:35:11.439 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:35:11 localhost nova_compute[230552]: 2026-02-20 09:35:11.441 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:35:11 localhost nova_compute[230552]: 2026-02-20 09:35:11.442 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:35:11 localhost nova_compute[230552]: 2026-02-20 09:35:11.442 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:11 localhost nova_compute[230552]: 2026-02-20 09:35:11.479 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:11 localhost nova_compute[230552]: 2026-02-20 09:35:11.481 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:11 localhost python3.9[278677]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 20 04:35:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:35:13 localhost podman[278984]: 2026-02-20 09:35:13.99299988 +0000 UTC m=+0.088254894 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Feb 20 04:35:14 localhost podman[278984]: 2026-02-20 09:35:14.032171792 +0000 UTC m=+0.127426836 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Feb 20 04:35:14 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:35:14 localhost python3.9[278983]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False Feb 20 04:35:15 localhost python3.9[279110]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 20 04:35:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14681 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A101680000000001030307) Feb 20 04:35:16 localhost python3[279220]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False Feb 20 04:35:16 localhost python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",#012 "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": 
"2026-01-30T06:31:38.534497001Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1214548351,#012 "VirtualSize": 1214548351,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",#012 
"sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",#012 "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 
"created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Feb 20 04:35:16 localhost nova_compute[230552]: 2026-02-20 09:35:16.435 230556 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Feb 20 04:35:16 localhost nova_compute[230552]: 2026-02-20 09:35:16.530 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:35:16 localhost nova_compute[230552]: 2026-02-20 09:35:16.533 230556 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:35:16 localhost nova_compute[230552]: 2026-02-20 09:35:16.533 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5052 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:35:16 localhost nova_compute[230552]: 2026-02-20 09:35:16.533 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:16 localhost nova_compute[230552]: 2026-02-20 09:35:16.535 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:16 localhost nova_compute[230552]: 2026-02-20 09:35:16.535 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:17 localhost podman[241968]: time="2026-02-20T09:35:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:35:17 localhost podman[241968]: @ - - [20/Feb/2026:09:35:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149688 "" "Go-http-client/1.1" Feb 20 04:35:17 localhost podman[241968]: @ - - [20/Feb/2026:09:35:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16661 "" "Go-http-client/1.1" Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.204 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.209 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '117c1219-0ba1-4428-ad43-4deaa8ea6aa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.204937', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77e01b9c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'eb557e1d85b064cd8efc0375bbd9c69b892493731233198ac408b3dcd755d132'}]}, 'timestamp': '2026-02-20 09:35:18.210355', '_unique_id': '54fb405872cd4847a2ba0e5ed9de29aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.213 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcf3083f-728a-491e-95b7-589e5f47273a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.213364', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77e0a6ca-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'dcdfbddae3d80a1452fd5673725190965b96a3d489ef78e0ba06c9bef95de973'}]}, 'timestamp': '2026-02-20 09:35:18.213883', '_unique_id': 'dfb771435e5b4afcbea0744223f7d934'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20
09:35:18.214 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:35:18
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.216 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '8931ba44-6206-4888-b684-b0b3ee2e4ba1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.216024', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77e10ea8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'a4c0cc1012cb8cf8c64464f5ca93ab6be6b6aae5656ab6ac4cc0dffb00ee311e'}]}, 'timestamp': '2026-02-20 09:35:18.216495', '_unique_id': 'c3eca172041c4e6392754a8fa727d56c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '3db1e5a3-3b74-4676-b99b-e5c5cdba6b28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.218790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77e6befc-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '9a056927dafaf180cc009b64a3827d6410523d541e6fa66abe151456b5a4c4c6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.218790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77e6d59a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '27bf41283f6adcf56dd09ab37f98a59a69e7a1b41f2dcb1b79f03d45c7f1ee40'}]}, 'timestamp': '2026-02-20 09:35:18.254346', '_unique_id': 'fb18afe3c3ca4d09870275748d765997'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20
04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.271 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.271 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b05d704-4195-4c8d-a73c-eb251d4e15ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.257037', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77e97f48-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': 'f5cd4f0d8eca889a63ea38352ecd204e37be7d85e9c5d63cae70089a7e18b743'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': 
'2026-02-20T09:35:18.257037', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77e99212-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': 'cc17747136debeffb05ece6d8dc4d49c4984c3141d7bd3edace0fe9abdbcfafe'}]}, 'timestamp': '2026-02-20 09:35:18.272263', '_unique_id': 'ddc17cc48f89433999407c5605ca1290'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.274 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.274 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.275 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59d15c49-a8d1-4719-ae49-6fee83158766', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.274937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77ea0b5c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 
'9b20040ee81058a50824a3f83b08614ef0232617b656c80df8d41b1d5e7d15a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.274937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77ea1bce-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 'b432806ef93ac4936ddd495c6d9b1a8797a6b3d56da2eb1d61bb5b2ad4d1743d'}]}, 'timestamp': '2026-02-20 09:35:18.275817', '_unique_id': 'b26b29b560ed4910b2b85bb8276c9b8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.278 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.297 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 62150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdfd1c40-3a68-4739-859b-68fc1e2eddb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 62150000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:35:18.278178', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '77ed7fd0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.536396081, 'message_signature': 'd78580b6c94b57a82c943ab67b7df3729bb1daf004cc25057eb6364c2380068b'}]}, 'timestamp': '2026-02-20 09:35:18.298087', '_unique_id': 'e74492e769c84e6787da2e2572c085a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.300 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.300 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95be735f-df03-45b6-a940-192350157131', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.300606', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77edf758-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '305f987621cf9f0db859f4eb600deed25ec8aede171c47d3a2d787af10338e37'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.300606', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77ee0806-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 'd7a699ddaa9e1f1a1a01b0c9ae176c1973354ea2e8c1a58c6bb9dca6dd0ff762'}]}, 'timestamp': '2026-02-20 09:35:18.301490', '_unique_id': 'a15f7b89224b45d5aed74c37c2e31841'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.303 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58b9c59e-26a3-441f-a002-4907607ec8ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.303891', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77ee7688-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': '63f512d4be2348f8d5e8c0c95eb16baf1cdb31106e1d1dc1cf7c87bd86ed9539'}]}, 'timestamp': '2026-02-20 09:35:18.304349', '_unique_id': '2cea341da9e945b39fcac3b20d2760f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.306 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.306 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '381b02b6-c2bd-455f-88d3-7205103941b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.306411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77eed880-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 'c4d5319bbad592b32bd670762947fa30d70007779b0cdfd93f88ddc4edd36608'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.306411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77eee9b0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '888cde16890c966f4acebf19247d72725d7e725bef4996ed3b4adcf962d74edc'}]}, 'timestamp': '2026-02-20 09:35:18.307271', '_unique_id': 'b3b7b1cbd5ef416abb060db7fcd5ca5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12
ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.309 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.309 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa80a170-b318-4f40-97ec-1e7c35ad3db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.309517', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77ef535a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'ad4efc634abecb7dddb6b662944e203473f4ed8ff6f1039c08b01fa5be458338'}]}, 'timestamp': '2026-02-20 09:35:18.310003', '_unique_id': '8c264a18bd1648f18b2229eb1b78c7ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py",
line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.312 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'ae917ae7-973f-49b1-87dc-c84ba0f4dacc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.312036', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77efb462-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'd7998f5001ab99e7a706ecf54a5e3bcd99126d3a9d3a6328cee672608daed54c'}]}, 'timestamp': '2026-02-20 09:35:18.312485', '_unique_id': 'cb78c4f9e1e6407bbe72a2b2dba6cd04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f70f3cf3-adcd-4fb6-aa5b-ac1f6c0fbda5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.314536', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77f0172c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': '94b9b8b7dc6bb3d1ec5e36f3f492de0315e8d88f9c9a7790698f8f37d9a3249c'}]}, 'timestamp': '2026-02-20 09:35:18.315014', '_unique_id': '30803500d497459fa431e91809f598e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:35:18.315 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1732ae1f-4241-49cc-95e4-d57f8613204d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.317063', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77f078de-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': '5add133ed3b0b0f903e128323923d3d762253f43dbf98069d4277613ac6abc05'}]}, 'timestamp': '2026-02-20 09:35:18.317512', '_unique_id': '65b2655cda5d4caf8c16357d81b1e56d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost nova_compute[230552]: 2026-02-20 09:35:18.319 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '94679d5a-2470-4a9e-93aa-16fca343d3b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.319516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77f0d9be-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': 'b56181c53f9b03e79dc2611e26c93284bb5c81fff005af32ff2743071aa52bdf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.319516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77f0e97c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': 'baf022f792a6022b137439267b4568624ed5967cd93a0211c95d72b61f986915'}]}, 'timestamp': '2026-02-20 09:35:18.320367', '_unique_id': 'f357daabc311462e9dda56f7df1cf3c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:35:18.321 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:35:18.321 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:35:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ceb88bdf-3c1b-4b66-b820-fd7740e08509', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.322446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77f14bf6-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '555c569d78587584c3f34c6f9e36713a0e84c748981b89aa256fa576a709e581'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.322446', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77f15bbe-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '5b5b954600680d718a16f79ec28cbd83696bf402a1caaa3f6d4b8345649f1841'}]}, 'timestamp': '2026-02-20 09:35:18.323292', '_unique_id': 'd3f363064e9945ecbfbe725b005b8313'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '775f73df-0865-4619-ba11-752b2be86046', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:35:18.325510', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '77f1c3d8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.536396081, 'message_signature': '3b6b4b1cc4f0ebb8befcaaf21b3115f6cc404fc8632ebe3001beaf4fa41b423e'}]}, 'timestamp': '2026-02-20 09:35:18.325971', '_unique_id': '7ce709db5b014e86b45069216a1f3ec8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '336d8d12-4fdf-4bb8-869f-8cdeb595bd65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.327985', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77f22346-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': '91eb5eddd7a2a488311413aa8cb0731dccae0ec3f66238146904282ebec5a47a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.327985', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77f23304-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': '090faad3b4274dbaaf963f5b0fa6ae7ca441d85e9a23b94072ad6190c43eaae6'}]}, 'timestamp': '2026-02-20 09:35:18.328855', '_unique_id': 'fa2aca54c2004e178a367c079a70115d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 9437 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25741b18-3767-40bb-b9a4-0804124bfe1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9437, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.330961', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77f297b8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'd62f403fd6e9ec16f2bc4a0bbd7759ce870acf4741ed1a2aa3bdb2275bf852f6'}]}, 'timestamp': '2026-02-20 09:35:18.331412', '_unique_id': 'fef8d67b0e3744f59649c9fca5b73afe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:35:18
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.333 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 93 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1728fd36-207a-45fb-9e98-8dc4ac5e68e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 93, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.333329', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77f2f14a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'd363b1d494a53ab334bed6e36750a811e8a21c6715664caffd8d3ecab0f55593'}]}, 'timestamp': '2026-02-20 09:35:18.333627', '_unique_id': 'ad708ea6982b4d179f4424c85a023214'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.335 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.335 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84272e9a-844e-458b-964a-5fa86e9a0d2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.335049', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77f334ac-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '4d0b2f2fb1841bcd32ed3397ad71d1febe28e64ebf4195bf1d1003fcc67f96f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.335049', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77f33f24-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 'cdcf2b4d089afb286366176f125b39b2d16d3eca95b4e34ee159a49b1d9d4124'}]}, 'timestamp': '2026-02-20 09:35:18.335604', '_unique_id': '7b85ec4dbfaf445ab126fc0dfa8f41a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:35:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging Feb 20 04:35:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.323 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.324 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.324 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.324 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - 
- - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.325 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.774 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.829 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:35:19 localhost nova_compute[230552]: 2026-02-20 09:35:19.830 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.018 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.020 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12147MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.021 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.022 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:35:20 localhost podman[279412]: 2026-02-20 09:35:20.063709715 +0000 UTC m=+0.093927009 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red 
Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=) Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.092 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.093 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.094 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:35:20 localhost podman[279412]: 2026-02-20 09:35:20.125069635 +0000 UTC m=+0.155286909 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_BRANCH=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.152 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.312 230556 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.315 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.315 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:35:20 localhost nova_compute[230552]: 2026-02-20 09:35:20.316 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:35:20 localhost systemd[1]: libpod-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782.scope: Deactivated successfully. Feb 20 04:35:20 localhost journal[206495]: End of file while reading data: Input/output error Feb 20 04:35:20 localhost systemd[1]: libpod-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782.scope: Consumed 19.442s CPU time. 
Feb 20 04:35:20 localhost podman[279270]: 2026-02-20 09:35:20.742998833 +0000 UTC m=+4.380518095 container died 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, org.label-schema.schema-version=1.0) Feb 20 04:35:20 localhost systemd[1]: tmp-crun.kJAYWo.mount: 
Deactivated successfully. Feb 20 04:35:20 localhost podman[279270]: 2026-02-20 09:35:20.879709515 +0000 UTC m=+4.517228807 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 20 04:35:20 localhost 
python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman stop nova_compute Feb 20 04:35:20 localhost podman[279536]: 2026-02-20 09:35:20.894879995 +0000 UTC m=+0.147115775 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=nova_compute, container_name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:35:20 localhost podman[279590]: error opening file `/run/crun/299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782/status`: No such file or directory Feb 20 04:35:21 localhost podman[279562]: 2026-02-20 09:35:21.010072021 +0000 UTC m=+0.092183675 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:35:21 localhost podman[279562]: nova_compute Feb 20 04:35:21 localhost podman[279555]: 2026-02-20 09:35:21.035308832 +0000 UTC m=+0.138271192 container remove 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=nova_compute, container_name=nova_compute) Feb 20 04:35:21 localhost python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Feb 20 04:35:21 localhost systemd[1]: var-lib-containers-storage-overlay-2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1-merged.mount: Deactivated successfully. Feb 20 04:35:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782-userdata-shm.mount: Deactivated successfully. Feb 20 04:35:21 localhost podman[279594]: Error: no container with name or ID "nova_compute" found: no such container Feb 20 04:35:21 localhost systemd[1]: edpm_nova_compute.service: Control process exited, code=exited, status=125/n/a Feb 20 04:35:21 localhost systemd[1]: edpm_nova_compute.service: Failed with result 'exit-code'. 
Feb 20 04:35:21 localhost podman[279602]: Feb 20 04:35:21 localhost podman[279602]: 2026-02-20 09:35:21.147273218 +0000 UTC m=+0.092017249 container create 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=nova_compute, tcib_managed=true, container_name=nova_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 20 04:35:21 
localhost podman[279602]: 2026-02-20 09:35:21.115013629 +0000 UTC m=+0.059757620 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 20 04:35:21 localhost python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network 
host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Feb 20 04:35:21 localhost systemd[1]: edpm_nova_compute.service: Scheduled restart job, restart counter is at 1. Feb 20 04:35:21 localhost systemd[1]: Stopped nova_compute container. Feb 20 04:35:21 localhost systemd[1]: Starting nova_compute container... Feb 20 04:35:21 localhost systemd[1]: Started libpod-conmon-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3.scope. Feb 20 04:35:21 localhost systemd[1]: Started libcrun container. 
Feb 20 04:35:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:21 localhost podman[279624]: 2026-02-20 09:35:21.291956366 +0000 UTC m=+0.127493987 container init 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': 
['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 04:35:21 localhost podman[279624]: 2026-02-20 09:35:21.303483633 +0000 UTC m=+0.139021244 container start 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': 
['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:35:21 localhost nova_compute[279644]: + sudo -E kolla_set_configs Feb 20 04:35:21 localhost python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman start nova_compute Feb 20 04:35:21 localhost systemd[1]: Started nova_compute container. 
Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Validating config file Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying service configuration files Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 20 04:35:21 localhost nova_compute[279644]: 
INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Deleting /etc/ceph Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Creating directory /etc/ceph Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Copying 
/var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Writing out command to execute Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:21 localhost nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 20 04:35:21 localhost nova_compute[279644]: ++ cat /run_command Feb 20 04:35:21 localhost nova_compute[279644]: + CMD=nova-compute Feb 20 04:35:21 localhost nova_compute[279644]: + ARGS= Feb 20 04:35:21 localhost nova_compute[279644]: + sudo kolla_copy_cacerts Feb 20 04:35:21 localhost nova_compute[279644]: + [[ ! -n '' ]] Feb 20 04:35:21 localhost nova_compute[279644]: + . kolla_extend_start Feb 20 04:35:21 localhost nova_compute[279644]: Running command: 'nova-compute' Feb 20 04:35:21 localhost nova_compute[279644]: + echo 'Running command: '\''nova-compute'\''' Feb 20 04:35:21 localhost nova_compute[279644]: + umask 0022 Feb 20 04:35:21 localhost nova_compute[279644]: + exec nova-compute Feb 20 04:35:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:35:22 localhost podman[279831]: 2026-02-20 09:35:22.980134636 +0000 UTC m=+0.073340041 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:35:22 localhost nova_compute[279644]: 2026-02-20 09:35:22.998 279667 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:22.999 279667 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:22.999 279667 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:22.999 279667 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 20 04:35:23 localhost podman[279831]: 2026-02-20 09:35:23.015812451 +0000 UTC m=+0.109017826 container exec_died 
010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:35:23 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:35:23 localhost python3.9[279830]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.115 279667 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.143 279667 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.143 279667 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.556 279667 INFO nova.virt.driver [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.668 279667 INFO nova.compute.provider_config [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] No provider configs found in /etc/nova/provider_config/. 
If files are present, ensure the Nova process has access.#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.682 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.682 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.683 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.683 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.683 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] command line args: [] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config_file = ['/etc/nova/nova.conf', 
'/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] console_host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.688 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.688 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.688 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_access_ip_network_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 
04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG 
oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 
09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.694 
279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] migrate_max_retries = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] my_block_storage_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] my_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] osapi_compute_listen_port = 8774 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 
localhost nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG 
oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reserved_host_memory_mb = 512 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] run_external_periodic_tasks = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] shelved_offload_time = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost 
nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plugging_timeout = 300 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG 
oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service 
[None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - 
- - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_pool_flush_on_reconnect = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.proxies = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.719 
279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.image_type_exclude_list = [] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.keyfile = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.status_code_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection_debug = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.db_retry_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.pool_timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection_parameters = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.max_overflow = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.api_servers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 
04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.733 
279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.instances_path_share = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.connect_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 
localhost nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.742 279667 
DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.barbican_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da 
- - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.verify_ssl = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost 
nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.disk_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.inject_partition = -2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.756 279667 WARNING oslo_config.cfg [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Feb 20 04:35:23 localhost nova_compute[279644]: live_migration_uri is deprecated for removal in favor of two other options that Feb 20 04:35:23 localhost nova_compute[279644]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Feb 20 04:35:23 localhost nova_compute[279644]: and ``live_migration_inbound_addr`` respectively. Feb 20 04:35:23 localhost nova_compute[279644]: ). Its value may be silently ignored in the future.#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_secret_uuid = a8557ee9-b55d-5519-942c-cf8f6172f1d8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.uid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 
09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - 
- - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - 
-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.connect_retry_delay = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.project_name = service log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.system_scope = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 
2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da 
- - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 
localhost nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.send_service_user_token = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.html5proxy_port = 6082 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.zlib_compression = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.timeout = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.datastore_regex = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost 
nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 
09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.791 
279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost 
nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 
localhost nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 
04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.801 
279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_notifications.driver = ['noop'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 
2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG 
oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None 
req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost 
nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] 
os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.815 279667 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.826 279667 INFO nova.virt.node [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.827 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.827 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.827 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Starting connection event dispatch thread initialize
/usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.827 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.839 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.841 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.842 279667 INFO nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Connection event '1' reason 'None'
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.846 279667 INFO nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML garbled in this capture (tags stripped); recoverable values: host UUID f44a30b3-674b-4e65-a07d-fb3d71d4ae11; arch x86_64; host CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp, rdma; memory 16116612 (4029153 pages); secmodels selinux (labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (+107:+107); hvm guest arches i686 (wordsize 32) and x86_64 (wordsize 64) via emulator /usr/libexec/qemu-kvm, machine types pc-i440fx-rhel7.6.0 (alias pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (alias q35)]
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.852 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.854 279667 DEBUG nova.virt.libvirt.volume.mount [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.856 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[domain capabilities XML garbled in this capture (tags stripped); recoverable values: emulator path /usr/libexec/qemu-kvm, domain type kvm, machine pc-q35-rhel9.8.0, arch i686; dump continues]
[domain capabilities dump continues, XML garbled in this capture; recoverable values: loader /usr/share/OVMF/OVMF_CODE.secboot.fd (types rom, pflash; readonly yes/no; secure no); host CPU model EPYC-Rome, vendor AMD; reported CPU models include 486, 486-v1, Broadwell (-IBRS, -noTSX, -noTSX-IBRS, -v1 through -v4), Cascadelake-Server (-noTSX, -v1 through -v5), ClearwaterForest (-v1), Conroe (-v1), Cooperlake (-v1, -v2), Denverton (-v1 through -v3), Dhyana (-v1, -v2), EPYC, EPYC-Genoa (-v1); listing truncated]
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Genoa-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-IBPB Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Milan Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 
localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Milan-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Milan-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Milan-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 
20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Rome Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Rome-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Rome-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Rome-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Rome-v4 Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Rome-v5 Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Turin Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Turin-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-v1 Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-v2 Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-v4 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-v5 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: GraniteRapids Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 
20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: GraniteRapids-v1 Feb 20 04:35:23 
localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: 
Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: GraniteRapids-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 
04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: GraniteRapids-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 
localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Haswell Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Haswell-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Haswell-noTSX Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Haswell-noTSX-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Haswell-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Haswell-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Haswell-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Haswell-v4 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server-noTSX Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 
04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: 
Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server-v3
Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server-v4
Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server-v5
Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server-v6
Feb 20 04:35:23 localhost nova_compute[279644]: Icelake-Server-v7
Feb 20 04:35:23 localhost nova_compute[279644]: IvyBridge
Feb 20 04:35:23 localhost nova_compute[279644]: IvyBridge-IBRS
Feb 20 04:35:23 localhost nova_compute[279644]: IvyBridge-v1
Feb 20 04:35:23 localhost nova_compute[279644]: IvyBridge-v2
Feb 20 04:35:23 localhost nova_compute[279644]: KnightsMill
Feb 20 04:35:23 localhost nova_compute[279644]: KnightsMill-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Nehalem
Feb 20 04:35:23 localhost nova_compute[279644]: Nehalem-IBRS
Feb 20 04:35:23 localhost nova_compute[279644]: Nehalem-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Nehalem-v2
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G1
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G1-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G2
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G2-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G3
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G3-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G4
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G4-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G5
Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G5-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Penryn
Feb 20 04:35:23 localhost nova_compute[279644]: Penryn-v1
Feb 20 04:35:23 localhost nova_compute[279644]: SandyBridge
Feb 20 04:35:23 localhost nova_compute[279644]: SandyBridge-IBRS
Feb 20 04:35:23 localhost nova_compute[279644]: SandyBridge-v1
Feb 20 04:35:23 localhost nova_compute[279644]: SandyBridge-v2
Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids
Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids-v1
Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids-v2
Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids-v3
Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids-v4
Feb 20 04:35:23 localhost nova_compute[279644]: SierraForest
Feb 20 04:35:23 localhost nova_compute[279644]: SierraForest-v1
Feb 20 04:35:23 localhost nova_compute[279644]: SierraForest-v2
Feb 20 04:35:23 localhost nova_compute[279644]: SierraForest-v3
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-IBRS
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-noTSX-IBRS
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-v2
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-v3
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-v4
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-IBRS
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-noTSX-IBRS
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v2
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v3
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v4
Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v5
Feb 20 04:35:23 localhost nova_compute[279644]: Snowridge
Feb 20 04:35:23 localhost nova_compute[279644]: Snowridge-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Snowridge-v2
Feb 20 04:35:23 localhost nova_compute[279644]: Snowridge-v3
Feb 20 04:35:23 localhost nova_compute[279644]: Snowridge-v4
Feb 20 04:35:23 localhost nova_compute[279644]: Westmere
Feb 20 04:35:23 localhost nova_compute[279644]: Westmere-IBRS
Feb 20 04:35:23 localhost nova_compute[279644]: Westmere-v1
Feb 20 04:35:23 localhost nova_compute[279644]: Westmere-v2
Feb 20 04:35:23 localhost nova_compute[279644]: athlon
Feb 20 04:35:23 localhost nova_compute[279644]: athlon-v1
Feb 20 04:35:23 localhost nova_compute[279644]: core2duo
Feb 20 04:35:23 localhost nova_compute[279644]: core2duo-v1
Feb 20 04:35:23 localhost nova_compute[279644]: coreduo
Feb 20 04:35:23 localhost nova_compute[279644]: coreduo-v1
nova_compute[279644]: kvm32 Feb 20 04:35:23 localhost nova_compute[279644]: kvm32-v1 Feb 20 04:35:23 localhost nova_compute[279644]: kvm64 Feb 20 04:35:23 localhost nova_compute[279644]: kvm64-v1 Feb 20 04:35:23 localhost nova_compute[279644]: n270 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: n270-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: pentium Feb 20 04:35:23 localhost nova_compute[279644]: pentium-v1 Feb 20 04:35:23 localhost nova_compute[279644]: pentium2 Feb 20 04:35:23 localhost nova_compute[279644]: pentium2-v1 Feb 20 04:35:23 localhost nova_compute[279644]: pentium3 Feb 20 04:35:23 localhost nova_compute[279644]: pentium3-v1 Feb 20 04:35:23 localhost nova_compute[279644]: phenom Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: phenom-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: qemu32 Feb 20 04:35:23 localhost nova_compute[279644]: qemu32-v1 Feb 20 04:35:23 localhost nova_compute[279644]: qemu64 Feb 20 04:35:23 localhost nova_compute[279644]: qemu64-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: file Feb 20 04:35:23 localhost nova_compute[279644]: anonymous Feb 20 04:35:23 localhost 
nova_compute[279644]: memfd Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: disk Feb 20 04:35:23 localhost nova_compute[279644]: cdrom Feb 20 04:35:23 localhost nova_compute[279644]: floppy Feb 20 04:35:23 localhost nova_compute[279644]: lun Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: fdc Feb 20 04:35:23 localhost nova_compute[279644]: scsi Feb 20 04:35:23 localhost nova_compute[279644]: virtio Feb 20 04:35:23 localhost nova_compute[279644]: usb Feb 20 04:35:23 localhost nova_compute[279644]: sata Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: virtio Feb 20 04:35:23 localhost nova_compute[279644]: virtio-transitional Feb 20 04:35:23 localhost nova_compute[279644]: virtio-non-transitional Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: vnc Feb 20 04:35:23 localhost nova_compute[279644]: egl-headless Feb 20 04:35:23 localhost nova_compute[279644]: dbus Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: subsystem Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: default Feb 20 04:35:23 localhost nova_compute[279644]: mandatory 
Feb 20 04:35:23 localhost nova_compute[279644]: requisite Feb 20 04:35:23 localhost nova_compute[279644]: optional Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: usb Feb 20 04:35:23 localhost nova_compute[279644]: pci Feb 20 04:35:23 localhost nova_compute[279644]: scsi Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: virtio Feb 20 04:35:23 localhost nova_compute[279644]: virtio-transitional Feb 20 04:35:23 localhost nova_compute[279644]: virtio-non-transitional Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: random Feb 20 04:35:23 localhost nova_compute[279644]: egd Feb 20 04:35:23 localhost nova_compute[279644]: builtin Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: path Feb 20 04:35:23 localhost nova_compute[279644]: handle Feb 20 04:35:23 localhost nova_compute[279644]: virtiofs Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: tpm-tis Feb 20 04:35:23 localhost nova_compute[279644]: tpm-crb Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: emulator Feb 20 04:35:23 localhost nova_compute[279644]: external Feb 20 04:35:23 
localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: 2.0 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: usb Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: pty Feb 20 04:35:23 localhost nova_compute[279644]: unix Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: qemu Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: builtin Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: default Feb 20 04:35:23 localhost nova_compute[279644]: passt Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: isa Feb 20 04:35:23 localhost nova_compute[279644]: hyperv Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: null Feb 20 04:35:23 localhost 
nova_compute[279644]: vc Feb 20 04:35:23 localhost nova_compute[279644]: pty Feb 20 04:35:23 localhost nova_compute[279644]: dev Feb 20 04:35:23 localhost nova_compute[279644]: file Feb 20 04:35:23 localhost nova_compute[279644]: pipe Feb 20 04:35:23 localhost nova_compute[279644]: stdio Feb 20 04:35:23 localhost nova_compute[279644]: udp Feb 20 04:35:23 localhost nova_compute[279644]: tcp Feb 20 04:35:23 localhost nova_compute[279644]: unix Feb 20 04:35:23 localhost nova_compute[279644]: qemu-vdagent Feb 20 04:35:23 localhost nova_compute[279644]: dbus Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: relaxed Feb 20 04:35:23 localhost nova_compute[279644]: vapic Feb 20 04:35:23 localhost nova_compute[279644]: spinlocks Feb 20 04:35:23 localhost nova_compute[279644]: vpindex Feb 20 04:35:23 localhost nova_compute[279644]: runtime Feb 20 04:35:23 localhost nova_compute[279644]: synic Feb 20 04:35:23 localhost nova_compute[279644]: stimer Feb 20 04:35:23 localhost nova_compute[279644]: reset Feb 20 04:35:23 localhost nova_compute[279644]: vendor_id Feb 20 04:35:23 localhost nova_compute[279644]: frequencies Feb 20 04:35:23 localhost nova_compute[279644]: 
reenlightenment Feb 20 04:35:23 localhost nova_compute[279644]: tlbflush Feb 20 04:35:23 localhost nova_compute[279644]: ipi Feb 20 04:35:23 localhost nova_compute[279644]: avic Feb 20 04:35:23 localhost nova_compute[279644]: emsr_bitmap Feb 20 04:35:23 localhost nova_compute[279644]: xmm_input Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: 4095 Feb 20 04:35:23 localhost nova_compute[279644]: on Feb 20 04:35:23 localhost nova_compute[279644]: off Feb 20 04:35:23 localhost nova_compute[279644]: off Feb 20 04:35:23 localhost nova_compute[279644]: Linux KVM Hv Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.876 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: /usr/libexec/qemu-kvm Feb 20 04:35:23 localhost nova_compute[279644]: kvm Feb 20 04:35:23 localhost nova_compute[279644]: pc-i440fx-rhel7.6.0 Feb 20 04:35:23 localhost nova_compute[279644]: i686 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: 
rom Feb 20 04:35:23 localhost nova_compute[279644]: pflash Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: yes Feb 20 04:35:23 localhost nova_compute[279644]: no Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: no Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: on Feb 20 04:35:23 localhost nova_compute[279644]: off Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: on Feb 20 04:35:23 localhost nova_compute[279644]: off Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: EPYC-Rome Feb 20 04:35:23 localhost nova_compute[279644]: AMD Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: 486 Feb 20 04:35:23 localhost nova_compute[279644]: 486-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Broadwell Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Broadwell-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Broadwell-noTSX Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Broadwell-noTSX-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Broadwell-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Broadwell-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Broadwell-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Broadwell-v4 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Cascadelake-Server Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Cascadelake-Server-noTSX Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Cascadelake-Server-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Cascadelake-Server-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Cascadelake-Server-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Cascadelake-Server-v4 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Cascadelake-Server-v5 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: ClearwaterForest Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: ClearwaterForest-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Conroe Feb 20 04:35:23 localhost nova_compute[279644]: Conroe-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Cooperlake Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Cooperlake-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 
Feb 20 04:35:23 localhost nova_compute[279644]: [CPU models reported in host capabilities] Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-Genoa-v2 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Milan-v3 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-Rome-v5 EPYC-Turin EPYC-Turin-v1 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 EPYC-v5 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 GraniteRapids-v3 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1
nova_compute[279644]: Opteron_G5 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Opteron_G5-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Penryn Feb 20 04:35:23 localhost nova_compute[279644]: Penryn-v1 Feb 20 04:35:23 localhost nova_compute[279644]: SandyBridge Feb 20 04:35:23 localhost nova_compute[279644]: SandyBridge-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: SandyBridge-v1 Feb 20 04:35:23 localhost nova_compute[279644]: SandyBridge-v2 Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: SapphireRapids-v4 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: SierraForest Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: SierraForest-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: SierraForest-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: SierraForest-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-noTSX-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 
20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Client-v4 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-noTSX-IBRS Feb 20 04:35:23 localhost nova_compute[279644]: Feb 
20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v1 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v2 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v3 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost 
nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v4 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Skylake-Server-v5 Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Snowridge Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 04:35:23 localhost nova_compute[279644]: Feb 20 
Feb 20 04:35:23 localhost nova_compute[279644]: [libvirt capabilities XML continued; XML markup lost in log capture, with the syslog prefix repeated for each stripped line. Recoverable values: CPU models Snowridge-v1 through Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1; memory backing sources file, anonymous, memfd; disk device types disk, cdrom, floppy, lun; disk buses ide, fdc, scsi, virtio, usb, sata; virtio model variants virtio, virtio-transitional, virtio-non-transitional; graphics types vnc, egl-headless, dbus; hostdev mode subsystem; firmware feature policies default, mandatory, requisite, optional; hostdev subsystem types usb, pci, scsi; rng backend models random, egd, builtin; filesystem drivers path, handle, virtiofs; TPM models tpm-tis, tpm-crb; TPM backends emulator, external; TPM version 2.0; redirdev bus usb; channel types pty, unix; backup drivers qemu, builtin; network backends default, passt; panic models isa, hyperv; serial/console types null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus; Hyper-V enlightenments relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; spinlocks retries 4095; on/off toggles; vendor_id value "Linux KVM Hv"]
Feb 20 04:35:23 localhost nova_compute[279644]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.914 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 04:35:23 localhost nova_compute[279644]: 2026-02-20 09:35:23.918 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 20 04:35:23 localhost nova_compute[279644]: [domainCapabilities XML follows; markup lost: emulator path /usr/libexec/qemu-kvm, domain type kvm, machine pc-q35-rhel9.8.0, arch x86_64]
Feb 20 04:35:23 localhost nova_compute[279644]: [domainCapabilities XML continued; markup lost. Recoverable values: os firmware efi; loader firmware images /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types rom, pflash; readonly yes/no; secure yes/no; further on/off enums; host CPU model EPYC-Rome, vendor AMD; supported CPU models 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 through Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 through Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1]
Feb 20 04:35:24 localhost python3.9[279968]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 04:35:24 localhost nova_compute[279644]: [domainCapabilities XML continued; markup lost. Recoverable values: supported CPU models Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2]
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Milan-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Rome Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Rome-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Rome-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Rome-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Rome-v4 Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Rome-v5 Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Turin Feb 20 04:35:24 localhost nova_compute[279644]: Feb 
20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Turin-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 
localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-v1 Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-v2 Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 
localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-v5 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: GraniteRapids Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 
localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: GraniteRapids-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: GraniteRapids-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: GraniteRapids-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 
04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Haswell Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Haswell-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Haswell-noTSX Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Haswell-noTSX-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Haswell-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Haswell-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Haswell-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Haswell-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server-noTSX Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 
localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 
04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 
Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server-v5
Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server-v6
Feb 20 04:35:24 localhost nova_compute[279644]: Icelake-Server-v7
Feb 20 04:35:24 localhost nova_compute[279644]: IvyBridge
Feb 20 04:35:24 localhost nova_compute[279644]: IvyBridge-IBRS
Feb 20 04:35:24 localhost nova_compute[279644]: IvyBridge-v1
Feb 20 04:35:24 localhost nova_compute[279644]: IvyBridge-v2
Feb 20 04:35:24 localhost nova_compute[279644]: KnightsMill
Feb 20 04:35:24 localhost nova_compute[279644]: KnightsMill-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Nehalem
Feb 20 04:35:24 localhost nova_compute[279644]: Nehalem-IBRS
Feb 20 04:35:24 localhost nova_compute[279644]: Nehalem-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Nehalem-v2
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G1
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G1-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G2
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G2-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G3
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G3-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G4
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G4-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G5
Feb 20 04:35:24 localhost nova_compute[279644]: Opteron_G5-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Penryn
Feb 20 04:35:24 localhost nova_compute[279644]: Penryn-v1
Feb 20 04:35:24 localhost nova_compute[279644]: SandyBridge
Feb 20 04:35:24 localhost nova_compute[279644]: SandyBridge-IBRS
Feb 20 04:35:24 localhost nova_compute[279644]: SandyBridge-v1
Feb 20 04:35:24 localhost nova_compute[279644]: SandyBridge-v2
Feb 20 04:35:24 localhost nova_compute[279644]: SapphireRapids
Feb 20 04:35:24 localhost nova_compute[279644]: SapphireRapids-v1
Feb 20 04:35:24 localhost nova_compute[279644]: SapphireRapids-v2
Feb 20 04:35:24 localhost nova_compute[279644]: SapphireRapids-v3
Feb 20 04:35:24 localhost nova_compute[279644]: SapphireRapids-v4
Feb 20 04:35:24 localhost nova_compute[279644]: SierraForest
Feb 20 04:35:24 localhost nova_compute[279644]: SierraForest-v1
Feb 20 04:35:24 localhost nova_compute[279644]: SierraForest-v2
Feb 20 04:35:24 localhost nova_compute[279644]: SierraForest-v3
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-IBRS
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-noTSX-IBRS
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-v2
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-v3
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-v4
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-IBRS
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-noTSX-IBRS
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v2
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v3
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v4
Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v5
Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge
Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge-v2
Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge-v3
Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge-v4
Feb 20 04:35:24 localhost nova_compute[279644]: Westmere
Feb 20 04:35:24 localhost nova_compute[279644]: Westmere-IBRS
Feb 20 04:35:24 localhost nova_compute[279644]: Westmere-v1
Feb 20 04:35:24 localhost nova_compute[279644]: Westmere-v2
Feb 20 04:35:24 localhost nova_compute[279644]: athlon
Feb 20 04:35:24 localhost nova_compute[279644]: athlon-v1
Feb 20 04:35:24 localhost nova_compute[279644]: core2duo
Feb 20 04:35:24 localhost nova_compute[279644]: core2duo-v1
Feb 20 04:35:24 localhost nova_compute[279644]: coreduo
Feb 20 04:35:24 localhost nova_compute[279644]: coreduo-v1
Feb 20 04:35:24 localhost nova_compute[279644]: kvm32
Feb 20 04:35:24 localhost nova_compute[279644]: kvm32-v1
Feb 20 04:35:24 localhost nova_compute[279644]: kvm64
Feb 20 04:35:24 localhost nova_compute[279644]: kvm64-v1
Feb 20 04:35:24 localhost nova_compute[279644]: n270
Feb 20 04:35:24 localhost nova_compute[279644]: n270-v1
Feb 20 04:35:24 localhost nova_compute[279644]: pentium
Feb 20 04:35:24 localhost nova_compute[279644]: pentium-v1
Feb 20 04:35:24 localhost nova_compute[279644]: pentium2
Feb 20 04:35:24 localhost nova_compute[279644]: pentium2-v1
Feb 20 04:35:24 localhost nova_compute[279644]: pentium3
Feb 20 04:35:24 localhost nova_compute[279644]: pentium3-v1
Feb 20 04:35:24 localhost nova_compute[279644]: phenom
Feb 20 04:35:24 localhost nova_compute[279644]: phenom-v1
Feb 20 04:35:24 localhost nova_compute[279644]: qemu32
Feb 20 04:35:24 localhost nova_compute[279644]: qemu32-v1
Feb 20 04:35:24 localhost nova_compute[279644]: qemu64
Feb 20 04:35:24 localhost nova_compute[279644]: qemu64-v1
Feb 20 04:35:24 localhost nova_compute[279644]: file
Feb 20 04:35:24 localhost nova_compute[279644]: anonymous
Feb 20 04:35:24 localhost nova_compute[279644]: memfd
Feb 20 04:35:24 localhost nova_compute[279644]: disk
Feb 20 04:35:24 localhost nova_compute[279644]: cdrom
Feb 20 04:35:24 localhost nova_compute[279644]: floppy
Feb 20 04:35:24 localhost nova_compute[279644]: lun
Feb 20 04:35:24 localhost nova_compute[279644]: fdc
Feb 20 04:35:24 localhost nova_compute[279644]: scsi
Feb 20 04:35:24 localhost nova_compute[279644]: virtio
Feb 20 04:35:24 localhost nova_compute[279644]: usb
Feb 20 04:35:24 localhost nova_compute[279644]: sata
Feb 20 04:35:24 localhost nova_compute[279644]: virtio
Feb 20 04:35:24 localhost nova_compute[279644]: virtio-transitional
Feb 20 04:35:24 localhost nova_compute[279644]: virtio-non-transitional
Feb 20 04:35:24 localhost nova_compute[279644]: vnc
Feb 20 04:35:24 localhost nova_compute[279644]: egl-headless
Feb 20 04:35:24 localhost nova_compute[279644]: dbus
Feb 20 04:35:24 localhost nova_compute[279644]: subsystem
Feb 20 04:35:24 localhost nova_compute[279644]: default
Feb 20 04:35:24 localhost nova_compute[279644]: mandatory
Feb 20 04:35:24 localhost nova_compute[279644]: requisite
Feb 20 04:35:24 localhost nova_compute[279644]: optional
Feb 20 04:35:24 localhost nova_compute[279644]: usb
Feb 20 04:35:24 localhost nova_compute[279644]: pci
Feb 20 04:35:24 localhost nova_compute[279644]: scsi
Feb 20 04:35:24 localhost nova_compute[279644]: virtio
Feb 20 04:35:24 localhost nova_compute[279644]: virtio-transitional
Feb 20 04:35:24 localhost nova_compute[279644]: virtio-non-transitional
Feb 20 04:35:24 localhost nova_compute[279644]: random
Feb 20 04:35:24 localhost nova_compute[279644]: egd
Feb 20 04:35:24 localhost nova_compute[279644]: builtin
Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: path Feb 20 04:35:24 localhost nova_compute[279644]: handle Feb 20 04:35:24 localhost nova_compute[279644]: virtiofs Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: tpm-tis Feb 20 04:35:24 localhost nova_compute[279644]: tpm-crb Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: emulator Feb 20 04:35:24 localhost nova_compute[279644]: external Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: 2.0 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: usb Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: pty Feb 20 04:35:24 localhost nova_compute[279644]: unix Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: qemu Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: 
builtin Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: default Feb 20 04:35:24 localhost nova_compute[279644]: passt Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: isa Feb 20 04:35:24 localhost nova_compute[279644]: hyperv Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: null Feb 20 04:35:24 localhost nova_compute[279644]: vc Feb 20 04:35:24 localhost nova_compute[279644]: pty Feb 20 04:35:24 localhost nova_compute[279644]: dev Feb 20 04:35:24 localhost nova_compute[279644]: file Feb 20 04:35:24 localhost nova_compute[279644]: pipe Feb 20 04:35:24 localhost nova_compute[279644]: stdio Feb 20 04:35:24 localhost nova_compute[279644]: udp Feb 20 04:35:24 localhost nova_compute[279644]: tcp Feb 20 04:35:24 localhost nova_compute[279644]: unix Feb 20 04:35:24 localhost nova_compute[279644]: qemu-vdagent Feb 20 04:35:24 localhost nova_compute[279644]: dbus Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: relaxed Feb 20 04:35:24 localhost nova_compute[279644]: vapic Feb 20 04:35:24 localhost nova_compute[279644]: spinlocks Feb 20 04:35:24 localhost nova_compute[279644]: vpindex Feb 20 04:35:24 localhost nova_compute[279644]: runtime Feb 20 04:35:24 localhost nova_compute[279644]: synic Feb 20 04:35:24 localhost nova_compute[279644]: stimer Feb 20 04:35:24 localhost nova_compute[279644]: reset Feb 20 04:35:24 localhost nova_compute[279644]: vendor_id Feb 20 04:35:24 localhost nova_compute[279644]: frequencies Feb 20 04:35:24 localhost nova_compute[279644]: reenlightenment Feb 20 04:35:24 localhost nova_compute[279644]: tlbflush Feb 20 04:35:24 localhost nova_compute[279644]: ipi Feb 20 04:35:24 localhost nova_compute[279644]: avic Feb 20 04:35:24 localhost nova_compute[279644]: emsr_bitmap Feb 20 04:35:24 localhost nova_compute[279644]: xmm_input Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: 4095 Feb 20 04:35:24 localhost nova_compute[279644]: on Feb 20 04:35:24 localhost nova_compute[279644]: off Feb 20 04:35:24 localhost nova_compute[279644]: off Feb 20 04:35:24 localhost nova_compute[279644]: Linux KVM Hv Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:23.988 279667 
DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: /usr/libexec/qemu-kvm Feb 20 04:35:24 localhost nova_compute[279644]: kvm Feb 20 04:35:24 localhost nova_compute[279644]: pc-i440fx-rhel7.6.0 Feb 20 04:35:24 localhost nova_compute[279644]: x86_64 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: rom Feb 20 04:35:24 localhost nova_compute[279644]: pflash Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: yes Feb 20 04:35:24 localhost nova_compute[279644]: no Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: no Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: on Feb 20 04:35:24 localhost nova_compute[279644]: off Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: on Feb 20 04:35:24 localhost nova_compute[279644]: off Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Rome Feb 20 04:35:24 localhost nova_compute[279644]: AMD Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: 486 Feb 20 04:35:24 localhost nova_compute[279644]: 486-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Broadwell Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Broadwell-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 
04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Broadwell-noTSX Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Broadwell-noTSX-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Broadwell-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Broadwell-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Broadwell-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 
04:35:24 localhost nova_compute[279644]: Broadwell-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Cascadelake-Server Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Cascadelake-Server-noTSX Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Cascadelake-Server-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 
04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Cascadelake-Server-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Cascadelake-Server-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Cascadelake-Server-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Cascadelake-Server-v5 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: ClearwaterForest Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: ClearwaterForest-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Conroe Feb 20 04:35:24 localhost nova_compute[279644]: Conroe-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Cooperlake Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Cooperlake-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Cooperlake-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Denverton Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Denverton-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Denverton-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Denverton-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Dhyana Feb 20 04:35:24 localhost nova_compute[279644]: Dhyana-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Dhyana-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: EPYC Feb 20 04:35:24 localhost nova_compute[279644]: EPYC-Genoa Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 
Feb 20 04:35:24 localhost nova_compute[279644]: supported CPU models: EPYC-Genoa-v1 EPYC-Genoa-v2 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Milan-v3 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-Rome-v5 EPYC-Turin EPYC-Turin-v1 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 EPYC-v5 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 GraniteRapids-v3 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: SapphireRapids-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: 
SapphireRapids-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 
04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: SapphireRapids-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 
localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: SierraForest Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: SierraForest-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 
localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: SierraForest-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: SierraForest-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-noTSX-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Client-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-noTSX-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Skylake-Server-v5 Feb 20 04:35:24 localhost 
nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge-v2 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 
localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge-v3 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Snowridge-v4 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Westmere Feb 20 04:35:24 localhost nova_compute[279644]: Westmere-IBRS Feb 20 04:35:24 localhost nova_compute[279644]: Westmere-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Westmere-v2 Feb 20 04:35:24 localhost nova_compute[279644]: athlon Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: athlon-v1 Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost nova_compute[279644]: core2duo Feb 20 04:35:24 localhost nova_compute[279644]: Feb 20 04:35:24 localhost 
Feb 20 04:35:24 localhost nova_compute[279644]: [multi-line libvirt domain capabilities XML; element tags were lost in log capture — recoverable values, in original order:]
Feb 20 04:35:24 localhost nova_compute[279644]:   CPU models: core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Feb 20 04:35:24 localhost nova_compute[279644]:   memory backing: file, anonymous, memfd
Feb 20 04:35:24 localhost nova_compute[279644]:   disk devices: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Feb 20 04:35:24 localhost nova_compute[279644]:   graphics: vnc, egl-headless, dbus
Feb 20 04:35:24 localhost nova_compute[279644]:   hostdev: subsystem; startup policy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi; models: virtio, virtio-transitional, virtio-non-transitional
Feb 20 04:35:24 localhost nova_compute[279644]:   rng backends: random, egd, builtin
Feb 20 04:35:24 localhost nova_compute[279644]:   filesystem: path, handle, virtiofs
Feb 20 04:35:24 localhost nova_compute[279644]:   tpm: models tpm-tis, tpm-crb; backends emulator, external; version 2.0
Feb 20 04:35:24 localhost nova_compute[279644]:   redirdev bus: usb; channel types: pty, unix; crypto backends: qemu, builtin
Feb 20 04:35:24 localhost nova_compute[279644]:   interface backends: default, passt; panic models: isa, hyperv
Feb 20 04:35:24 localhost nova_compute[279644]:   character device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Feb 20 04:35:24 localhost nova_compute[279644]:   hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input
Feb 20 04:35:24 localhost nova_compute[279644]:   additional values: 4095, on, off, off, Linux KVM Hv
Feb 20 04:35:24 localhost nova_compute[279644]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.055 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Checking secure boot support for host
arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.056 279667 INFO nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Secure Boot support detected#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.059 279667 INFO nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.073 279667 DEBUG nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.095 279667 INFO nova.virt.node [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.111 279667 DEBUG nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Verified node 41976f9f-3656-482f-8ad0-c81e454a3952 matches my host np0005625204.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.139 279667 DEBUG nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -]
[instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.143 279667 DEBUG nova.virt.libvirt.vif [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005625204.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-02-20T08:23:36Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.144 279667 DEBUG nova.network.os_vif_util [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": 
"e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.145 279667 DEBUG nova.network.os_vif_util [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.145 279667 DEBUG os_vif [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.189 279667 DEBUG ovsdbapp.backend.ovs_idl [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.189 279667 DEBUG ovsdbapp.backend.ovs_idl [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Created schema index Port.name autocreate_indices 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.189 279667 DEBUG ovsdbapp.backend.ovs_idl [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.190 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.191 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.191 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.192 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.193 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.198 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.214 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.214 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.214 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.215 279667 INFO oslo.privsep.daemon [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp7ui3496_/privsep.sock']#033[00m Feb 20 04:35:24 localhost python3.9[280044]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.822 279667 INFO oslo.privsep.daemon [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.701 280119 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 20 04:35:24 localhost 
nova_compute[279644]: 2026-02-20 09:35:24.705 280119 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.709 280119 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Feb 20 04:35:24 localhost nova_compute[279644]: 2026-02-20 09:35:24.709 280119 INFO oslo.privsep.daemon [-] privsep daemon running as pid 280119#033[00m Feb 20 04:35:24 localhost python3.9[280158]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771580124.4658573-3314-64964355351845/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.106 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.107 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7aa8e2a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.109 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7aa8e2a-27, col_values=(('external_ids', {'iface-id': 'e7aa8e2a-27a6-452b-906c-21cea166b882', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:ed:d2', 'vm-uuid': 'f9924957-6cff-426e-9f03-c739820f4ff3'}),)) 
do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.110 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.110 279667 INFO os_vif [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.111 279667 DEBUG nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.114 279667 DEBUG nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Current state is 1, state in DB is 1. 
_init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.114 279667 INFO nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.222 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.222 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.222 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.223 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.224 279667 DEBUG oslo_concurrency.processutils [None 
req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.684 279667 DEBUG oslo_concurrency.processutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.754 279667 DEBUG nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.755 279667 DEBUG nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:35:25 localhost python3.9[280236]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.987 279667 WARNING nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.988 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12188MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.989 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:35:25 localhost nova_compute[279644]: 2026-02-20 09:35:25.989 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.134 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.134 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.136 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.196 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.213 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 
04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.214 279667 DEBUG nova.compute.provider_tree [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.230 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.248 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: 
HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SHA,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.288 279667 DEBUG oslo_concurrency.processutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.538 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:26 localhost openstack_network_exporter[244414]: ERROR 09:35:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:35:26 localhost openstack_network_exporter[244414]: Feb 20 04:35:26 localhost openstack_network_exporter[244414]: ERROR 09:35:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:35:26 localhost openstack_network_exporter[244414]: Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.793 279667 DEBUG oslo_concurrency.processutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.799 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Feb 20 04:35:26 localhost nova_compute[279644]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.800 279667 INFO nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] kernel doesn't support AMD SEV#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.801 279667 DEBUG nova.compute.provider_tree [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.802 279667 DEBUG nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] CPU mode 'host-model' 
models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.824 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.849 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.849 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.849 279667 DEBUG nova.service [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.880 279667 DEBUG nova.service [None 
req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Feb 20 04:35:26 localhost nova_compute[279644]: 2026-02-20 09:35:26.881 279667 DEBUG nova.servicegroup.drivers.db [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] DB_Driver: join new ServiceGroup member np0005625204.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Feb 20 04:35:27 localhost python3.9[280370]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 20 04:35:28 localhost python3.9[280480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 20 04:35:29 localhost nova_compute[279644]: 2026-02-20 09:35:29.231 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:29 localhost python3.9[280570]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771580128.5151887-3437-222993006237039/.source.yaml _original_basename=.teucteh_ follow=False checksum=1398ce19331de48b62372cc81e1a3aaab78c97b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:35:30 localhost podman[280588]: 2026-02-20 09:35:30.154557934 +0000 UTC m=+0.089263357 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:35:30 localhost podman[280588]: 2026-02-20 09:35:30.164272103 +0000 UTC m=+0.098977586 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:35:30 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:35:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16664 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A13B540000000001030307) Feb 20 04:35:30 localhost python3.9[280701]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:35:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:35:31 localhost systemd[1]: tmp-crun.kS4XhE.mount: Deactivated successfully. Feb 20 04:35:31 localhost podman[280719]: 2026-02-20 09:35:31.158493854 +0000 UTC m=+0.094770437 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, architecture=x86_64, build-date=2026-02-05T04:57:10Z, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9) Feb 20 04:35:31 localhost podman[280719]: 2026-02-20 09:35:31.200313095 +0000 UTC m=+0.136589668 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Feb 20 04:35:31 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:35:31 localhost nova_compute[279644]: 2026-02-20 09:35:31.541 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16665 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A13F680000000001030307) Feb 20 04:35:31 localhost python3.9[280830]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:35:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14682 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A141680000000001030307) Feb 20 04:35:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:35:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:35:33 localhost systemd[1]: tmp-crun.zg9zzQ.mount: Deactivated successfully. 
Feb 20 04:35:33 localhost podman[280939]: 2026-02-20 09:35:33.161968798 +0000 UTC m=+0.097586963 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:35:33 localhost python3.9[280938]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 20 04:35:33 localhost podman[280939]: 2026-02-20 09:35:33.247383425 +0000 UTC m=+0.183001630 container exec_died 
67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Feb 20 04:35:33 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:35:33 localhost podman[280940]: 2026-02-20 09:35:33.248808189 +0000 UTC m=+0.181439361 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Feb 20 04:35:33 localhost 
podman[280940]: 2026-02-20 09:35:33.33208623 +0000 UTC m=+0.264717452 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:35:33 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:35:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16666 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A147680000000001030307) Feb 20 04:35:34 localhost systemd[1]: tmp-crun.pPg1UP.mount: Deactivated successfully. Feb 20 04:35:34 localhost nova_compute[279644]: 2026-02-20 09:35:34.234 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:34 localhost python3.9[281091]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None 
health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 20 04:35:34 localhost systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 120.4 (401 of 333 items), suggesting rotation. Feb 20 04:35:34 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 20 04:35:34 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:35:34 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:35:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32286 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A14B680000000001030307) Feb 20 04:35:35 localhost sshd[281146]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:35:35 localhost python3.9[281225]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 20 04:35:35 localhost systemd[1]: Stopping nova_compute container... Feb 20 04:35:35 localhost nova_compute[279644]: 2026-02-20 09:35:35.988 279667 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Feb 20 04:35:36 localhost nova_compute[279644]: 2026-02-20 09:35:36.544 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16667 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A157280000000001030307) Feb 20 04:35:38 localhost sshd[281243]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:35:39 localhost nova_compute[279644]: 2026-02-20 09:35:39.237 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:40 
localhost nova_compute[279644]: 2026-02-20 09:35:40.100 279667 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m Feb 20 04:35:40 localhost nova_compute[279644]: 2026-02-20 09:35:40.102 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:35:40 localhost nova_compute[279644]: 2026-02-20 09:35:40.103 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:35:40 localhost nova_compute[279644]: 2026-02-20 09:35:40.103 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:35:40 localhost journal[206495]: End of file while reading data: Input/output error Feb 20 04:35:40 localhost systemd[1]: libpod-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3.scope: Deactivated successfully. Feb 20 04:35:40 localhost systemd[1]: libpod-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3.scope: Consumed 4.819s CPU time. 
Feb 20 04:35:40 localhost podman[281229]: 2026-02-20 09:35:40.51155153 +0000 UTC m=+4.604321140 container died 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute) Feb 20 04:35:40 localhost systemd[1]: 
var-lib-containers-storage-overlay\x2dcontainers-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3-userdata-shm.mount: Deactivated successfully. Feb 20 04:35:40 localhost systemd[1]: var-lib-containers-storage-overlay-dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace-merged.mount: Deactivated successfully. Feb 20 04:35:40 localhost podman[281229]: 2026-02-20 09:35:40.616693246 +0000 UTC m=+4.709462826 container cleanup 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:35:40 localhost podman[281229]: nova_compute Feb 20 04:35:40 localhost podman[281244]: 2026-02-20 09:35:40.629766639 +0000 UTC m=+0.104917489 container cleanup 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, 
config_id=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:35:40 localhost systemd[1]: libpod-conmon-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3.scope: Deactivated successfully. Feb 20 04:35:40 localhost podman[281270]: error opening file `/run/crun/4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3/status`: No such file or directory Feb 20 04:35:40 localhost podman[281259]: 2026-02-20 09:35:40.720007014 +0000 UTC m=+0.062772538 container cleanup 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', 
'/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:35:40 localhost podman[281259]: nova_compute Feb 20 04:35:40 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Feb 20 04:35:40 localhost systemd[1]: Stopped nova_compute container. Feb 20 04:35:40 localhost systemd[1]: Starting nova_compute container... Feb 20 04:35:40 localhost systemd[1]: Started libcrun container. Feb 20 04:35:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:40 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:40 localhost podman[281274]: 2026-02-20 09:35:40.868209629 +0000 UTC m=+0.111630886 container init 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:35:40 localhost podman[281274]: 2026-02-20 09:35:40.87793502 +0000 UTC m=+0.121356287 container start 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:35:40 localhost podman[281274]: nova_compute Feb 20 04:35:40 localhost nova_compute[281288]: + sudo -E kolla_set_configs Feb 20 04:35:40 localhost systemd[1]: Started nova_compute container. Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Validating config file Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying service configuration files Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 20 04:35:40 localhost 
nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /etc/ceph Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Creating directory /etc/ceph Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 20 04:35:40 localhost 
nova_compute[281288]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Writing out command to execute Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:40 localhost nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 20 04:35:40 localhost nova_compute[281288]: ++ cat /run_command Feb 20 04:35:40 localhost nova_compute[281288]: + CMD=nova-compute Feb 20 04:35:40 localhost nova_compute[281288]: + ARGS= Feb 20 04:35:40 localhost nova_compute[281288]: + sudo kolla_copy_cacerts Feb 20 04:35:41 localhost 
nova_compute[281288]: + [[ ! -n '' ]] Feb 20 04:35:41 localhost nova_compute[281288]: + . kolla_extend_start Feb 20 04:35:41 localhost nova_compute[281288]: Running command: 'nova-compute' Feb 20 04:35:41 localhost nova_compute[281288]: + echo 'Running command: '\''nova-compute'\''' Feb 20 04:35:41 localhost nova_compute[281288]: + umask 0022 Feb 20 04:35:41 localhost nova_compute[281288]: + exec nova-compute Feb 20 04:35:41 localhost python3.9[281409]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None 
log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 20 04:35:41 localhost systemd[1]: Started libpod-conmon-d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4.scope. Feb 20 04:35:42 localhost systemd[1]: Started libcrun container. 
Feb 20 04:35:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d870576fe32644ef4ff787e84b8ce7e75b29857bb1a6c6466c1660fea99b7567/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d870576fe32644ef4ff787e84b8ce7e75b29857bb1a6c6466c1660fea99b7567/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d870576fe32644ef4ff787e84b8ce7e75b29857bb1a6c6466c1660fea99b7567/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 20 04:35:42 localhost podman[281433]: 2026-02-20 09:35:42.018082855 +0000 UTC m=+0.131884772 container init d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}) Feb 20 04:35:42 localhost podman[281433]: 2026-02-20 09:35:42.027208176 +0000 UTC m=+0.141010093 container start d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Feb 20 04:35:42 localhost python3.9[281409]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Applying nova statedir ownership Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 
Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3/ Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3 already 42436:42436 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3 to system_u:object_r:container_file_t:s0 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3/console.log Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to 
system_u:object_r:container_file_t:s0 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ccf3906461ed5c78e2a6f963756ac32b4b049bce Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ccf3906461ed5c78e2a6f963756ac32b4b049bce Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Feb 
20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9 Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/d301d14069645d8c23fee2987984776b3e88a570e1aa96d6cf3e31fa880385fd Feb 20 04:35:42 localhost nova_compute_init[281454]: INFO:nova_statedir:Nova statedir ownership complete Feb 20 04:35:42 localhost systemd[1]: libpod-d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4.scope: Deactivated successfully. 
Feb 20 04:35:42 localhost podman[281455]: 2026-02-20 09:35:42.104790511 +0000 UTC m=+0.059223759 container died d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, config_id=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127) Feb 20 04:35:42 localhost podman[281466]: 2026-02-20 09:35:42.179579029 +0000 UTC m=+0.077430730 container cleanup d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=nova_compute_init, container_name=nova_compute_init, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': 
{'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 04:35:42 localhost systemd[1]: libpod-conmon-d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4.scope: Deactivated successfully. Feb 20 04:35:42 localhost systemd[1]: var-lib-containers-storage-overlay-d870576fe32644ef4ff787e84b8ce7e75b29857bb1a6c6466c1660fea99b7567-merged.mount: Deactivated successfully. Feb 20 04:35:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:35:42 localhost nova_compute[281288]: 2026-02-20 09:35:42.594 281292 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:35:42 localhost nova_compute[281288]: 2026-02-20 09:35:42.594 281292 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:35:42 localhost nova_compute[281288]: 2026-02-20 09:35:42.595 281292 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 20 04:35:42 localhost nova_compute[281288]: 2026-02-20 09:35:42.595 281292 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 20 04:35:42 localhost nova_compute[281288]: 2026-02-20 09:35:42.703 281292 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:35:42 localhost nova_compute[281288]: 2026-02-20 09:35:42.713 281292 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:35:42 localhost nova_compute[281288]: 2026-02-20 09:35:42.713 281292 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 20 04:35:42 localhost systemd[1]: session-60.scope: Deactivated successfully. Feb 20 04:35:42 localhost systemd[1]: session-60.scope: Consumed 1min 25.257s CPU time. Feb 20 04:35:42 localhost systemd-logind[759]: Session 60 logged out. Waiting for processes to exit. Feb 20 04:35:42 localhost systemd-logind[759]: Removed session 60. 
Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.167 281292 INFO nova.virt.driver [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.282 281292 INFO nova.compute.provider_config [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.302 281292 DEBUG oslo_concurrency.lockutils [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.302 281292 DEBUG oslo_concurrency.lockutils [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.302 281292 DEBUG oslo_concurrency.lockutils [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.303 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.303 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 20 04:35:43 localhost 
nova_compute[281288]: 2026-02-20 09:35:43.303 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.303 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 
04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.305 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.305 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.305 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.305 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.306 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.306 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.306 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 
09:35:43.306 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.307 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.307 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.307 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] console_host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.307 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.308 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.308 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.308 281292 DEBUG oslo_service.service [None 
req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.308 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.309 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.309 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.309 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_schedule_zone = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.311 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.311 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.311 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m 
Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.311 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.312 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.312 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] host = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.312 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.313 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.313 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.313 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.313 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.314 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.314 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.314 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.314 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.315 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.315 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.315 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.315 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.316 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.316 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.316 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.316 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.317 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 
04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.317 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.317 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.317 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.318 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.318 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.318 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.318 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.320 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost 
nova_compute[281288]: 2026-02-20 09:35:43.320 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.320 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.320 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.321 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.321 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.321 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.321 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.322 281292 DEBUG oslo_service.service 
[None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.322 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.322 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.322 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] my_block_storage_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.323 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] my_ip = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.323 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.323 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.323 281292 DEBUG oslo_service.service [None 
req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.324 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.324 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.324 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.324 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.325 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.325 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.325 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] pointer_model = 
usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.325 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.326 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.326 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.326 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.326 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.327 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.327 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rate_limit_interval = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.327 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.327 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.328 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.328 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.328 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.328 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 
09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.330 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.330 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.330 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.331 281292 DEBUG oslo_service.service 
[None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.331 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.331 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.331 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.332 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.332 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.332 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.332 281292 DEBUG oslo_service.service [None 
req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.333 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.333 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.333 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.333 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.334 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.334 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.334 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] sync_power_state_interval = 600 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.334 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.335 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.335 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.335 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.335 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost 
nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - 
- - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 
- - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG 
oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service 
[None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service 
[None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 
- - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_username = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 
04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service 
[None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] 
cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.cpu_shared_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] 
compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] 
consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 
localhost nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.353 
281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service 
[None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - 
-] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection_parameters = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.max_overflow = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.api_servers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 
04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.360 
281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] 
glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.instances_path_share = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:43 localhost nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 20 04:35:58 localhost nova_compute[281288]: 2026-02-20 09:35:58.964 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:35:59 localhost rsyslogd[758]: imjournal: 8932 messages lost due to rate-limiting (20000 allowed within 600 seconds) Feb 20 04:36:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51779 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1B0840000000001030307) Feb 20 04:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:36:00 localhost systemd[1]: tmp-crun.ff18gd.mount: Deactivated successfully. 
Feb 20 04:36:00 localhost podman[281627]: 2026-02-20 09:36:00.873072112 +0000 UTC m=+0.089030029 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:36:00 localhost podman[281627]: 2026-02-20 09:36:00.88404006 +0000 UTC m=+0.099997967 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:36:00 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:36:01 localhost nova_compute[281288]: 2026-02-20 09:36:01.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51780 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1B4A90000000001030307) Feb 20 04:36:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:36:02 localhost podman[281649]: 2026-02-20 09:36:02.128419413 +0000 UTC m=+0.071277791 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.7, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:36:02 localhost podman[281649]: 2026-02-20 09:36:02.146264255 +0000 UTC m=+0.089122633 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': 
True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:36:02 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:36:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16669 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1B7680000000001030307) Feb 20 04:36:02 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:02.573 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:36:02 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:02.577 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:36:02 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:02.579 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) 
do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:02 localhost nova_compute[281288]: 2026-02-20 09:36:02.611 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51781 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1BCA90000000001030307) Feb 20 04:36:04 localhost nova_compute[281288]: 2026-02-20 09:36:04.006 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:36:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:36:04 localhost systemd[1]: tmp-crun.5iB1O5.mount: Deactivated successfully. 
Feb 20 04:36:04 localhost podman[281669]: 2026-02-20 09:36:04.150472651 +0000 UTC m=+0.089743711 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:36:04 localhost systemd[1]: tmp-crun.i7teXq.mount: Deactivated successfully. 
Feb 20 04:36:04 localhost podman[281670]: 2026-02-20 09:36:04.195614015 +0000 UTC m=+0.130292753 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:36:04 localhost 
podman[281669]: 2026-02-20 09:36:04.204480999 +0000 UTC m=+0.143752009 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Feb 20 04:36:04 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:36:04 localhost podman[281670]: 2026-02-20 09:36:04.226281051 +0000 UTC m=+0.160959829 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 20 04:36:04 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:36:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14683 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1BF690000000001030307) Feb 20 04:36:05 localhost nova_compute[281288]: 2026-02-20 09:36:05.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:05 localhost nova_compute[281288]: 2026-02-20 09:36:05.764 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 20 04:36:05 localhost nova_compute[281288]: 2026-02-20 09:36:05.765 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:36:05 localhost nova_compute[281288]: 2026-02-20 09:36:05.765 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:36:05 localhost nova_compute[281288]: 2026-02-20 09:36:05.766 281292 DEBUG 
oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:05 localhost nova_compute[281288]: 2026-02-20 09:36:05.822 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:36:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:05.999 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:36:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:05.999 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:36:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:06.001 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:36:06 localhost nova_compute[281288]: 2026-02-20 09:36:06.581 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51782 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1CC690000000001030307) Feb 20 04:36:09 localhost nova_compute[281288]: 2026-02-20 09:36:09.048 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:11 localhost nova_compute[281288]: 2026-02-20 09:36:11.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:14 localhost nova_compute[281288]: 2026-02-20 09:36:14.111 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:36:16 localhost podman[281712]: 2026-02-20 09:36:16.151680952 +0000 UTC m=+0.084478038 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:36:16 localhost podman[281712]: 2026-02-20 09:36:16.163963582 +0000 UTC m=+0.096760658 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team) Feb 20 04:36:16 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:36:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51783 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1ED680000000001030307) Feb 20 04:36:16 localhost nova_compute[281288]: 2026-02-20 09:36:16.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:17 localhost podman[241968]: time="2026-02-20T09:36:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:36:17 localhost nova_compute[281288]: 2026-02-20 09:36:17.702 281292 DEBUG nova.compute.manager [None req-d8ad929c-a450-4219-9ce8-2dd486b6c045 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:17 localhost nova_compute[281288]: 2026-02-20 09:36:17.713 281292 INFO nova.compute.manager [None req-d8ad929c-a450-4219-9ce8-2dd486b6c045 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Retrieving diagnostics#033[00m Feb 20 04:36:17 localhost podman[241968]: @ - - [20/Feb/2026:09:36:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149681 "" "Go-http-client/1.1" Feb 20 04:36:17 localhost podman[241968]: @ - - [20/Feb/2026:09:36:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" 
"Go-http-client/1.1" Feb 20 04:36:19 localhost nova_compute[281288]: 2026-02-20 09:36:19.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:21 localhost nova_compute[281288]: 2026-02-20 09:36:21.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:36:23 localhost podman[281819]: 2026-02-20 09:36:23.984937985 +0000 UTC m=+0.079600878 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:36:23 localhost nova_compute[281288]: 2026-02-20 09:36:23.996 281292 DEBUG oslo_concurrency.lockutils [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by 
"nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:36:23 localhost nova_compute[281288]: 2026-02-20 09:36:23.996 281292 DEBUG oslo_concurrency.lockutils [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:36:23 localhost podman[281819]: 2026-02-20 09:36:23.997383029 +0000 UTC m=+0.092045972 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:36:23 localhost nova_compute[281288]: 2026-02-20 09:36:23.997 281292 DEBUG nova.compute.manager [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:24 localhost nova_compute[281288]: 2026-02-20 09:36:24.002 281292 DEBUG nova.compute.manager [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m Feb 20 04:36:24 localhost nova_compute[281288]: 2026-02-20 09:36:24.006 281292 DEBUG nova.objects.instance [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'flavor' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:24 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:36:24 localhost nova_compute[281288]: 2026-02-20 09:36:24.047 281292 DEBUG nova.virt.libvirt.driver [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Feb 20 04:36:24 localhost nova_compute[281288]: 2026-02-20 09:36:24.174 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:24 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. 
Feb 20 04:36:25 localhost sshd[281842]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:36:26 localhost openstack_network_exporter[244414]: ERROR 09:36:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:36:26 localhost openstack_network_exporter[244414]: Feb 20 04:36:26 localhost openstack_network_exporter[244414]: ERROR 09:36:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:36:26 localhost openstack_network_exporter[244414]: Feb 20 04:36:26 localhost nova_compute[281288]: 2026-02-20 09:36:26.611 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:26 localhost kernel: device tape7aa8e2a-27 left promiscuous mode Feb 20 04:36:26 localhost NetworkManager[5988]: [1771580186.8682] device (tape7aa8e2a-27): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Feb 20 04:36:26 localhost ovn_controller[156798]: 2026-02-20T09:36:26Z|00048|binding|INFO|Releasing lport e7aa8e2a-27a6-452b-906c-21cea166b882 from this chassis (sb_readonly=0) Feb 20 04:36:26 localhost ovn_controller[156798]: 2026-02-20T09:36:26Z|00049|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 down in Southbound Feb 20 04:36:26 localhost nova_compute[281288]: 2026-02-20 09:36:26.877 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:26 localhost ovn_controller[156798]: 2026-02-20T09:36:26Z|00050|binding|INFO|Removing iface tape7aa8e2a-27 ovn-installed in OVS Feb 20 04:36:26 localhost nova_compute[281288]: 2026-02-20 09:36:26.880 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:26.888 162652 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:ed:d2 192.168.0.140'], port_security=['fa:16:3e:b0:ed:d2 192.168.0.140'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.140/24', 'neutron:device_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005625204.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de929a91-c460-4398-96e0-15a80685a485', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '91bce661d685472eb3e7cacab17bf52a', 'neutron:revision_number': '7', 'neutron:security_group_ids': '571bc6f6-22b1-4aad-9b70-3481475089c6 dd806cfc-5243-4295-bd9f-cfd9f58a9f1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee1d7cd7-5f4f-4b75-a06c-f37c0ef97c77, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e7aa8e2a-27a6-452b-906c-21cea166b882) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:36:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:26.889 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e7aa8e2a-27a6-452b-906c-21cea166b882 in datapath de929a91-c460-4398-96e0-15a80685a485 unbound from our chassis#033[00m Feb 20 04:36:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:26.891 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network de929a91-c460-4398-96e0-15a80685a485, tearing the 
namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:36:26 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully. Feb 20 04:36:26 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 55.907s CPU time. Feb 20 04:36:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:26.896 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4877190f-adc9-49ce-82ab-04a36db91dc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:26.897 162652 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-de929a91-c460-4398-96e0-15a80685a485 namespace which is not needed anymore#033[00m Feb 20 04:36:26 localhost ovn_controller[156798]: 2026-02-20T09:36:26Z|00051|ovn_bfd|INFO|Disabled BFD on interface ovn-2df8cc-0 Feb 20 04:36:26 localhost ovn_controller[156798]: 2026-02-20T09:36:26Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-0c414b-0 Feb 20 04:36:26 localhost ovn_controller[156798]: 2026-02-20T09:36:26Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-2275c3-0 Feb 20 04:36:26 localhost nova_compute[281288]: 2026-02-20 09:36:26.899 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:26 localhost systemd-machined[85698]: Machine qemu-1-instance-00000002 terminated. 
Feb 20 04:36:26 localhost ovn_controller[156798]: 2026-02-20T09:36:26Z|00054|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:36:26 localhost nova_compute[281288]: 2026-02-20 09:36:26.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:26 localhost nova_compute[281288]: 2026-02-20 09:36:26.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:26 localhost ovn_controller[156798]: 2026-02-20T09:36:26Z|00055|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:36:26 localhost nova_compute[281288]: 2026-02-20 09:36:26.944 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:27 localhost systemd[1]: tmp-crun.D5MwRK.mount: Deactivated successfully. 
Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.088 281292 DEBUG nova.compute.manager [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received event network-vif-unplugged-e7aa8e2a-27a6-452b-906c-21cea166b882 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.089 281292 DEBUG oslo_concurrency.lockutils [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.089 281292 DEBUG oslo_concurrency.lockutils [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.090 281292 DEBUG oslo_concurrency.lockutils [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.091 281292 DEBUG nova.compute.manager [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] No waiting events found dispatching network-vif-unplugged-e7aa8e2a-27a6-452b-906c-21cea166b882 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.091 281292 WARNING nova.compute.manager [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received unexpected event network-vif-unplugged-e7aa8e2a-27a6-452b-906c-21cea166b882 for instance with vm_state active and task_state powering-off.#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.110 281292 INFO nova.virt.libvirt.driver [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance shutdown successfully after 3 seconds.#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.117 281292 INFO nova.virt.libvirt.driver [-] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance destroyed successfully.#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.118 281292 DEBUG nova.objects.instance [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'numa_topology' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.140 281292 DEBUG nova.compute.manager [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:27 localhost nova_compute[281288]: 2026-02-20 09:36:27.230 281292 DEBUG oslo_concurrency.lockutils [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 3.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.121 281292 DEBUG nova.compute.manager [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.122 281292 DEBUG oslo_concurrency.lockutils [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.122 281292 
DEBUG oslo_concurrency.lockutils [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.122 281292 DEBUG oslo_concurrency.lockutils [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.123 281292 DEBUG nova.compute.manager [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] No waiting events found dispatching network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.123 281292 WARNING nova.compute.manager [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received unexpected event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 for instance with vm_state stopped and task_state None.#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.176 
281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.527 281292 DEBUG nova.compute.manager [None req-021827a0-cd14-4b02-9789-58184dab09a3 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server [None req-021827a0-cd14-4b02-9789-58184dab09a3 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance f9924957-6cff-426e-9f03-c739820f4ff3 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File 
"/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server self.force_reraise() Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server raise self.value Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Feb 20 04:36:29 localhost 
nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server self.force_reraise() Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server raise self.value Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance f9924957-6cff-426e-9f03-c739820f4ff3 in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Feb 20 04:36:29 localhost nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server #033[00m Feb 20 04:36:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24909 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A225B40000000001030307) Feb 20 04:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:36:31 localhost podman[281894]: 2026-02-20 09:36:31.139214178 +0000 UTC m=+0.077775541 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:36:31 localhost podman[281894]: 2026-02-20 09:36:31.153042035 +0000 UTC m=+0.091603418 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:36:31 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:36:31 localhost nova_compute[281288]: 2026-02-20 09:36:31.613 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24910 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A229A80000000001030307) Feb 20 04:36:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51784 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A22D680000000001030307) Feb 20 04:36:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:36:33 localhost podman[281917]: 2026-02-20 09:36:33.139536895 +0000 UTC m=+0.079813354 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:36:33 localhost podman[281917]: 2026-02-20 09:36:33.151060241 +0000 UTC m=+0.091336690 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64) Feb 20 04:36:33 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:36:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24911 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A231A90000000001030307) Feb 20 04:36:34 localhost nova_compute[281288]: 2026-02-20 09:36:34.211 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16670 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A235680000000001030307) Feb 20 04:36:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:36:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:36:35 localhost podman[281937]: 2026-02-20 09:36:35.142591856 +0000 UTC m=+0.074097077 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true) Feb 20 04:36:35 localhost podman[281938]: 2026-02-20 09:36:35.203077124 +0000 UTC m=+0.130917643 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, 
config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Feb 20 04:36:35 localhost podman[281938]: 2026-02-20 09:36:35.212127573 +0000 UTC m=+0.139968062 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Feb 20 04:36:35 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:36:35 localhost podman[281937]: 2026-02-20 09:36:35.228027884 +0000 UTC m=+0.159533115 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 20 04:36:35 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:36:36 localhost nova_compute[281288]: 2026-02-20 09:36:36.615 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:37 localhost podman[281871]: 2026-02-20 09:36:37.072146909 +0000 UTC m=+10.063554309 container stop 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com) Feb 20 04:36:37 localhost systemd[1]: libpod-57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba.scope: Deactivated successfully. 
Feb 20 04:36:37 localhost podman[281871]: 2026-02-20 09:36:37.107929714 +0000 UTC m=+10.099337114 container died 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 20 04:36:37 localhost systemd[1]: tmp-crun.7Ya6an.mount: Deactivated successfully. Feb 20 04:36:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:36:37 localhost podman[281871]: 2026-02-20 09:36:37.226184825 +0000 UTC m=+10.217592175 container cleanup 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z) Feb 20 04:36:37 localhost podman[281981]: 2026-02-20 09:36:37.243104017 +0000 UTC m=+0.155573354 container cleanup 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, batch=17.1_20260112.1, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 20 04:36:37 localhost systemd[1]: libpod-conmon-57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba.scope: Deactivated successfully. 
Feb 20 04:36:37 localhost podman[281998]: 2026-02-20 09:36:37.314533312 +0000 UTC m=+0.069792015 container remove 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.319 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6782c0-8516-43e4-a1b7-501455cb25e4]: (4, ('Fri Feb 20 09:36:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485 
(57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba)\n57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba\nFri Feb 20 09:36:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485 (57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba)\n57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba\n', 'time="2026-02-20T09:36:37Z" level=warning msg="StopSignal SIGTERM failed to stop container neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485 in 10 seconds, resorting to SIGKILL"\n', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.320 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7fef79dc-5d5b-42ea-961b-bac4ffd2cb6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.321 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde929a91-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:37 localhost kernel: device tapde929a91-c0 left promiscuous mode Feb 20 04:36:37 localhost nova_compute[281288]: 2026-02-20 09:36:37.323 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:37 localhost nova_compute[281288]: 2026-02-20 09:36:37.333 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.335 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[ff17f338-5811-4353-8c51-dbc18da58c3c]: (4, True) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.352 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1567e5cd-e0ce-4d69-8a05-0b8b4d446fcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.353 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a19b93-1b75-40c4-b29d-31b53aace2ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.365 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5e93af60-4cec-47fe-bcc4-c4ae05ad6ab2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637497, 'reachable_time': 16205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 
'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282021, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.377 163070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-de929a91-c460-4398-96e0-15a80685a485 deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Feb 20 04:36:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:37.378 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[a2bd3bdc-ca23-4af5-8518-79225000d70e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24912 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A241690000000001030307) Feb 20 04:36:38 localhost systemd[1]: var-lib-containers-storage-overlay-7ea5b54d1da71d972d7e8dd243987640d185da35de896817d599cfae85808380-merged.mount: Deactivated successfully. Feb 20 04:36:38 localhost systemd[1]: run-netns-ovnmeta\x2dde929a91\x2dc460\x2d4398\x2d96e0\x2d15a80685a485.mount: Deactivated successfully. 
Feb 20 04:36:39 localhost nova_compute[281288]: 2026-02-20 09:36:39.252 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:41 localhost nova_compute[281288]: 2026-02-20 09:36:41.618 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:42 localhost nova_compute[281288]: 2026-02-20 09:36:42.110 281292 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 20 04:36:42 localhost nova_compute[281288]: 2026-02-20 09:36:42.111 281292 INFO nova.compute.manager [-] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] VM Stopped (Lifecycle Event)#033[00m Feb 20 04:36:42 localhost nova_compute[281288]: 2026-02-20 09:36:42.140 281292 DEBUG nova.compute.manager [None req-11efd309-e505-42a7-ae98-feffabb72bc2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:42 localhost nova_compute[281288]: 2026-02-20 09:36:42.144 281292 DEBUG nova.compute.manager [None req-11efd309-e505-42a7-ae98-feffabb72bc2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Feb 20 04:36:42 localhost nova_compute[281288]: 2026-02-20 09:36:42.774 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:42 localhost nova_compute[281288]: 
2026-02-20 09:36:42.776 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:42 localhost nova_compute[281288]: 2026-02-20 09:36:42.776 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:36:42 localhost nova_compute[281288]: 2026-02-20 09:36:42.777 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:36:43 localhost nova_compute[281288]: 2026-02-20 09:36:43.817 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:36:43 localhost nova_compute[281288]: 2026-02-20 09:36:43.818 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:36:43 localhost nova_compute[281288]: 2026-02-20 09:36:43.818 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:36:43 localhost nova_compute[281288]: 2026-02-20 09:36:43.819 281292 DEBUG nova.objects.instance [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.272 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.308 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.308 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.309 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.310 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.310 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.311 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.311 281292 DEBUG 
oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.312 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.312 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.313 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.337 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.337 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 
2026-02-20 09:36:44.338 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.338 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.339 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.797 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.874 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:36:44 localhost nova_compute[281288]: 2026-02-20 09:36:44.875 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.075 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.077 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12615MB free_disk=41.8370475769043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": 
"7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.077 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.078 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.179 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.180 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.180 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.244 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.754 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.761 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 
2026-02-20 09:36:45.777 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.793 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:36:45 localhost nova_compute[281288]: 2026-02-20 09:36:45.794 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:36:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24913 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A261680000000001030307) Feb 20 04:36:46 localhost nova_compute[281288]: 2026-02-20 09:36:46.641 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:47 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:36:47 localhost podman[282067]: 2026-02-20 09:36:47.14028233 +0000 UTC m=+0.078580477 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:36:47 localhost podman[282067]: 2026-02-20 09:36:47.173135474 +0000 UTC m=+0.111433651 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 20 04:36:47 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:36:47 localhost podman[241968]: time="2026-02-20T09:36:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:36:47 localhost podman[241968]: @ - - [20/Feb/2026:09:36:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147347 "" "Go-http-client/1.1" Feb 20 04:36:47 localhost podman[241968]: @ - - [20/Feb/2026:09:36:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16317 "" "Go-http-client/1.1" Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.063 281292 DEBUG nova.compute.manager [None req-841530b3-780a-4b09-b7a8-bfdd0314cf97 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server [None req-841530b3-780a-4b09-b7a8-bfdd0314cf97 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance f9924957-6cff-426e-9f03-c739820f4ff3 in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server self.force_reraise() Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR 
oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server raise self.value Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server self.force_reraise() Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server raise self.value Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Feb 20 
04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance f9924957-6cff-426e-9f03-c739820f4ff3 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Feb 20 04:36:48 localhost nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server #033[00m Feb 20 04:36:49 localhost nova_compute[281288]: 2026-02-20 09:36:49.339 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:50 localhost sshd[282084]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:36:51 localhost nova_compute[281288]: 2026-02-20 09:36:51.691 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:36:54 localhost podman[282086]: 2026-02-20 09:36:54.147840695 +0000 UTC m=+0.083313074 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:36:54 localhost podman[282086]: 2026-02-20 09:36:54.156080199 +0000 UTC m=+0.091552578 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:36:54 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:36:54 localhost nova_compute[281288]: 2026-02-20 09:36:54.385 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:55 localhost nova_compute[281288]: 2026-02-20 09:36:55.931 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'flavor' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:55 localhost nova_compute[281288]: 2026-02-20 09:36:55.950 281292 DEBUG oslo_concurrency.lockutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:36:55 localhost nova_compute[281288]: 2026-02-20 09:36:55.951 281292 DEBUG oslo_concurrency.lockutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:36:55 localhost nova_compute[281288]: 2026-02-20 09:36:55.951 281292 DEBUG nova.network.neutron [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Building network 
info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Feb 20 04:36:55 localhost nova_compute[281288]: 2026-02-20 09:36:55.952 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:56 localhost openstack_network_exporter[244414]: ERROR 09:36:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:36:56 localhost openstack_network_exporter[244414]: Feb 20 04:36:56 localhost openstack_network_exporter[244414]: ERROR 09:36:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:36:56 localhost openstack_network_exporter[244414]: Feb 20 04:36:56 localhost nova_compute[281288]: 2026-02-20 09:36:56.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:56 localhost ovn_controller[156798]: 2026-02-20T09:36:56Z|00056|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Feb 20 04:36:57 localhost nova_compute[281288]: 2026-02-20 09:36:57.952 281292 DEBUG nova.network.neutron [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:36:57 localhost nova_compute[281288]: 2026-02-20 09:36:57.970 281292 DEBUG oslo_concurrency.lockutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.004 281292 INFO nova.virt.libvirt.driver [-] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance destroyed successfully.#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.005 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'numa_topology' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.020 281292 DEBUG nova.objects.instance [None 
req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'resources' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.032 281292 DEBUG nova.virt.libvirt.vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_poi
nter_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2026-02-20T09:36:27Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.032 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converting VIF {"id": 
"e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.034 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.034 281292 DEBUG os_vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 
141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.038 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.039 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7aa8e2a-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.043 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.047 281292 INFO os_vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')#033[00m 
Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.050 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.051 281292 INFO nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] UEFI support detected#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.060 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Start _get_guest_xml network_info=[{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=43eca6d8-1b99-4300-a417-76015fcc59e1,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_options': None, 'image_id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}], 'ephemerals': [{'size': 1, 'device_name': '/dev/vdb', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'guest_format': None, 'encryption_options': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.065 281292 WARNING nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.067 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Searching host: 'np0005625204.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.068 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.071 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Searching host: 'np0005625204.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.071 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.072 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.073 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-20T08:22:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='739ef37c-e459-414b-b65a-355581d54c7c',id=2,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=43eca6d8-1b99-4300-a417-76015fcc59e1,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.074 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.074 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 
141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.075 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.075 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.076 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.076 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.076 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 
91bce661d685472eb3e7cacab17bf52a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.077 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.077 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.078 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.078 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'vcpu_model' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.098 281292 DEBUG nova.privsep.utils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default 
default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.098 281292 DEBUG oslo_concurrency.processutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.582 281292 DEBUG oslo_concurrency.processutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:36:58 localhost nova_compute[281288]: 2026-02-20 09:36:58.584 281292 DEBUG oslo_concurrency.processutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.041 281292 DEBUG oslo_concurrency.processutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.043 281292 DEBUG nova.virt.libvirt.vif [None 
req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2026-02-20T09:36:27Z,us
er_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.044 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.045 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.048 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'pci_devices' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.062 281292 DEBUG nova.virt.libvirt.driver [None 
req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] End _get_guest_xml xml= [libvirt domain XML elided: the element markup was stripped when this log was captured, leaving only runs of bare "Feb 20 04:36:59 localhost nova_compute[281288]:" continuation prefixes; values still legible in the residue include name instance-00000002, uuid f9924957-6cff-426e-9f03-c739820f4ff3, memory 524288, 1 vCPU, creation time 2026-02-20 09:36:58, display name test, owner admin/admin, sysinfo RDO / OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9, description Virtual Machine, os type hvm, and an RNG backend of /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.065 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.065 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.066 281292 DEBUG nova.virt.libvirt.vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=,power_state=4,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2026-02-20T09:36:27Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=VirtCPUModel,vcpus=
1,vm_mode=None,vm_state='stopped') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.067 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.068 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.068 281292 DEBUG os_vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.069 281292 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.070 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.070 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.073 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.074 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7aa8e2a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.075 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7aa8e2a-27, col_values=(('external_ids', {'iface-id': 'e7aa8e2a-27a6-452b-906c-21cea166b882', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:ed:d2', 'vm-uuid': 'f9924957-6cff-426e-9f03-c739820f4ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.106 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 
04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.112 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.113 281292 INFO os_vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')#033[00m Feb 20 04:36:59 localhost systemd[1]: Started libvirt secret daemon. Feb 20 04:36:59 localhost kernel: device tape7aa8e2a-27 entered promiscuous mode Feb 20 04:36:59 localhost NetworkManager[5988]: [1771580219.2357] manager: (tape7aa8e2a-27): new Tun device (/org/freedesktop/NetworkManager/Devices/15) Feb 20 04:36:59 localhost ovn_controller[156798]: 2026-02-20T09:36:59Z|00057|binding|INFO|Claiming lport e7aa8e2a-27a6-452b-906c-21cea166b882 for this chassis. 
Feb 20 04:36:59 localhost ovn_controller[156798]: 2026-02-20T09:36:59Z|00058|binding|INFO|e7aa8e2a-27a6-452b-906c-21cea166b882: Claiming fa:16:3e:b0:ed:d2 192.168.0.140 Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.240 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.242 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost systemd-udevd[282184]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.250 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost ovn_controller[156798]: 2026-02-20T09:36:59Z|00059|ovn_bfd|INFO|Enabled BFD on interface ovn-2df8cc-0 Feb 20 04:36:59 localhost ovn_controller[156798]: 2026-02-20T09:36:59Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-0c414b-0 Feb 20 04:36:59 localhost ovn_controller[156798]: 2026-02-20T09:36:59Z|00061|ovn_bfd|INFO|Enabled BFD on interface ovn-2275c3-0 Feb 20 04:36:59 localhost NetworkManager[5988]: [1771580219.2613] device (tape7aa8e2a-27): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 20 04:36:59 localhost NetworkManager[5988]: [1771580219.2628] device (tape7aa8e2a-27): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.257 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:ed:d2 192.168.0.140'], 
port_security=['fa:16:3e:b0:ed:d2 192.168.0.140'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.140/24', 'neutron:device_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de929a91-c460-4398-96e0-15a80685a485', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '91bce661d685472eb3e7cacab17bf52a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '571bc6f6-22b1-4aad-9b70-3481475089c6 dd806cfc-5243-4295-bd9f-cfd9f58a9f1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee1d7cd7-5f4f-4b75-a06c-f37c0ef97c77, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e7aa8e2a-27a6-452b-906c-21cea166b882) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.260 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e7aa8e2a-27a6-452b-906c-21cea166b882 in datapath de929a91-c460-4398-96e0-15a80685a485 bound to our chassis#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.262 162652 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network de929a91-c460-4398-96e0-15a80685a485#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.272 162782 DEBUG 
oslo.privsep.daemon [-] privsep: reply[302c4fce-e22d-4f11-a468-70933805a63c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.274 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapde929a91-c1 in ovnmeta-de929a91-c460-4398-96e0-15a80685a485 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.277 162782 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapde929a91-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.277 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d95a06ea-e086-4143-affd-4f1b67cb81de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.278 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[fb255f4c-c6f4-4210-bbca-bc94e43ffad3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.287 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.291 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[34d42aa8-f5da-4b86-902d-bd24cd518e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.307 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:36:59.312 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1737f3-7107-49f6-8a3d-0e5fa543f8f3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost systemd-machined[85698]: New machine qemu-2-instance-00000002. Feb 20 04:36:59 localhost ovn_controller[156798]: 2026-02-20T09:36:59Z|00062|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 ovn-installed in OVS Feb 20 04:36:59 localhost ovn_controller[156798]: 2026-02-20T09:36:59Z|00063|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 up in Southbound Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.321 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.322 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost systemd[1]: Started Virtual Machine qemu-2-instance-00000002. Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.347 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6fb9f0-ad58-4a7c-bc25-2a7057745176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost NetworkManager[5988]: [1771580219.3544] manager: (tapde929a91-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/16) Feb 20 04:36:59 localhost systemd-udevd[282186]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.353 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c97b44ec-9548-4ed3-8a36-db4ffb2ad3d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.360 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.380 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[c0557cf3-56fb-49fa-92c7-743cc9a4bd81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.383 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac74dfc-540f-4f62-8425-d467f194d280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapde929a91-c1: link becomes ready Feb 20 04:36:59 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapde929a91-c0: link becomes ready Feb 20 04:36:59 localhost NetworkManager[5988]: [1771580219.3988] device (tapde929a91-c0): carrier: link connected Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.404 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[36b9c4ce-9a72-4628-8238-b5dbcf7c9d23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.420 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[921bc494-a999-43ed-b33c-41aa47474ebb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde929a91-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], 
['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:09:c2:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 
'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1077856, 'reachable_time': 22503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], 
['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282222, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.434 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5f107aa7-dc2e-46c1-a8fb-3ff869717ed0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:c288'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1077856, 'tstamp': 1077856}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282223, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.450 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c83e9b92-6474-4604-a752-467a8221d44c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde929a91-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 
'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:09:c2:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1077856, 
'reachable_time': 22503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 282224, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.474 162782 DEBUG 
oslo.privsep.daemon [-] privsep: reply[96c2b696-0977-4f05-9b38-904cc808f002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.530 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[51cfdfdd-717c-402f-bbc5-8f31f57b34d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.532 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde929a91-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.533 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.534 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde929a91-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.536 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost kernel: device tapde929a91-c0 entered promiscuous mode Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.541 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapde929a91-c0, col_values=(('external_ids', {'iface-id': '3323e11d-576a-42f3-bcca-e10425268e61'}),)) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.542 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost ovn_controller[156798]: 2026-02-20T09:36:59Z|00064|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.553 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.555 162652 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.556 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[54c62514-27bf-4273-a5bb-8712ece7b139]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.558 162652 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: global Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: log /dev/log local0 debug Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: log-tag haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485 Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: user root Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: group root Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: maxconn 1024 Feb 20 04:36:59 
localhost ovn_metadata_agent[162647]: pidfile /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: daemon Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: defaults Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: log global Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: mode http Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: option httplog Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: option dontlognull Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: option http-server-close Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: option forwardfor Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: retries 3 Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: timeout http-request 30s Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: timeout connect 30s Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: timeout client 32s Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: timeout server 32s Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: timeout http-keep-alive 30s Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: listen listener Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: bind 169.254.169.254:80 Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: server metadata /var/lib/neutron/metadata_proxy Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: http-request add-header X-OVN-Network-ID de929a91-c460-4398-96e0-15a80685a485 Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Feb 20 04:36:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:36:59.560 162652 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', 
'/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'env', 'PROCESS_TAG=haproxy-de929a91-c460-4398-96e0-15a80685a485', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.678 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.679 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] VM Resumed (Lifecycle Event)#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.683 281292 DEBUG nova.compute.manager [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.686 281292 INFO nova.virt.libvirt.driver [-] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance rebooted successfully.#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.687 281292 DEBUG nova.compute.manager [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.701 281292 DEBUG nova.compute.manager 
[None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.710 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.733 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.734 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.734 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] VM Started (Lifecycle Event)#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.773 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.778 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 
f9924957-6cff-426e-9f03-c739820f4ff3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.975 281292 DEBUG nova.compute.manager [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:36:59 localhost podman[282299]: Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.977 281292 DEBUG oslo_concurrency.lockutils [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.978 281292 DEBUG oslo_concurrency.lockutils [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.979 281292 DEBUG oslo_concurrency.lockutils [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 
req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.979 281292 DEBUG nova.compute.manager [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] No waiting events found dispatching network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 20 04:36:59 localhost nova_compute[281288]: 2026-02-20 09:36:59.980 281292 WARNING nova.compute.manager [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received unexpected event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 for instance with vm_state active and task_state None.#033[00m Feb 20 04:36:59 localhost podman[282299]: 2026-02-20 09:36:59.986531948 +0000 UTC m=+0.091907548 container create 9903bc3cf3f19e861e096f5de614cbdd9eb528d04ba83f01013fe4247cd76683 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.vendor=CentOS) Feb 20 04:37:00 localhost systemd[1]: Started libpod-conmon-9903bc3cf3f19e861e096f5de614cbdd9eb528d04ba83f01013fe4247cd76683.scope. Feb 20 04:37:00 localhost podman[282299]: 2026-02-20 09:36:59.940320011 +0000 UTC m=+0.045695621 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 20 04:37:00 localhost systemd[1]: Started libcrun container. Feb 20 04:37:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daf1a97849e0b2cfedd7ec5e1af6ebe76cd044cbaef292fa2ede4328d64f268d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:37:00 localhost podman[282299]: 2026-02-20 09:37:00.094691176 +0000 UTC m=+0.200066776 container init 9903bc3cf3f19e861e096f5de614cbdd9eb528d04ba83f01013fe4247cd76683 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:37:00 localhost podman[282299]: 2026-02-20 09:37:00.104341564 +0000 UTC m=+0.209717184 container start 9903bc3cf3f19e861e096f5de614cbdd9eb528d04ba83f01013fe4247cd76683 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20260127) Feb 20 04:37:00 localhost ovn_controller[156798]: 2026-02-20T09:37:00Z|00065|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:37:00 localhost nova_compute[281288]: 2026-02-20 09:37:00.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:00 localhost neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485[282313]: [NOTICE] (282317) : New worker (282319) forked Feb 20 04:37:00 localhost neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485[282313]: [NOTICE] (282317) : Loading success. Feb 20 04:37:00 localhost ovn_controller[156798]: 2026-02-20T09:37:00Z|00066|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:37:00 localhost nova_compute[281288]: 2026-02-20 09:37:00.201 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:00 localhost systemd[1]: tmp-crun.1FGOnR.mount: Deactivated successfully. Feb 20 04:37:00 localhost ovn_controller[156798]: 2026-02-20T09:37:00Z|00067|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:37:00 localhost nova_compute[281288]: 2026-02-20 09:37:00.309 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42373 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A29AE50000000001030307) Feb 20 04:37:00 localhost snmpd[68593]: IfIndex of an interface changed. 
Such interfaces will appear multiple times in IF-MIB. Feb 20 04:37:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42374 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A29EE80000000001030307) Feb 20 04:37:02 localhost nova_compute[281288]: 2026-02-20 09:37:02.139 281292 DEBUG nova.compute.manager [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:37:02 localhost nova_compute[281288]: 2026-02-20 09:37:02.140 281292 DEBUG oslo_concurrency.lockutils [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:37:02 localhost nova_compute[281288]: 2026-02-20 09:37:02.140 281292 DEBUG oslo_concurrency.lockutils [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:37:02 localhost nova_compute[281288]: 2026-02-20 09:37:02.140 281292 DEBUG 
oslo_concurrency.lockutils [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:37:02 localhost nova_compute[281288]: 2026-02-20 09:37:02.140 281292 DEBUG nova.compute.manager [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] No waiting events found dispatching network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 20 04:37:02 localhost nova_compute[281288]: 2026-02-20 09:37:02.141 281292 WARNING nova.compute.manager [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received unexpected event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 for instance with vm_state active and task_state None.#033[00m Feb 20 04:37:02 localhost nova_compute[281288]: 2026-02-20 09:37:02.177 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:37:02 localhost systemd[1]: tmp-crun.T9TTvY.mount: Deactivated successfully. 
Feb 20 04:37:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24914 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2A1690000000001030307) Feb 20 04:37:02 localhost podman[282328]: 2026-02-20 09:37:02.309901247 +0000 UTC m=+0.105689544 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:37:02 localhost podman[282328]: 2026-02-20 09:37:02.34338379 +0000 UTC 
m=+0.139172117 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:37:02 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:37:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42375 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2A6E80000000001030307) Feb 20 04:37:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:37:04 localhost nova_compute[281288]: 2026-02-20 09:37:04.106 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:04 localhost podman[282350]: 2026-02-20 09:37:04.144986603 +0000 UTC m=+0.076756750 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, com.redhat.component=ubi9-minimal-container) Feb 20 04:37:04 localhost podman[282350]: 2026-02-20 09:37:04.158963975 +0000 UTC m=+0.090734172 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 20 04:37:04 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:37:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51785 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2AB690000000001030307) Feb 20 04:37:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:05.999 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:37:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:06.000 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:37:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:06.001 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:37:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:37:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:37:06 localhost podman[282371]: 2026-02-20 09:37:06.136274322 +0000 UTC m=+0.072034215 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:37:06 localhost podman[282371]: 2026-02-20 09:37:06.204727884 +0000 UTC m=+0.140487797 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:37:06 localhost systemd[1]: tmp-crun.zi4tqc.mount: Deactivated successfully. Feb 20 04:37:06 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:37:06 localhost podman[282372]: 2026-02-20 09:37:06.21495627 +0000 UTC m=+0.148754742 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:37:06 localhost 
podman[282372]: 2026-02-20 09:37:06.298401636 +0000 UTC m=+0.232200078 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:37:06 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:37:07 localhost nova_compute[281288]: 2026-02-20 09:37:07.212 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42376 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2B6A90000000001030307) Feb 20 04:37:09 localhost nova_compute[281288]: 2026-02-20 09:37:09.111 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:12 localhost nova_compute[281288]: 2026-02-20 09:37:12.253 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:13 localhost ovn_controller[156798]: 2026-02-20T09:37:13Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:ed:d2 192.168.0.140 Feb 20 04:37:14 localhost nova_compute[281288]: 2026-02-20 09:37:14.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42377 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2D7680000000001030307) Feb 20 04:37:16 localhost sshd[282413]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:37:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:37:17 localhost nova_compute[281288]: 2026-02-20 09:37:17.305 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:17 localhost podman[282415]: 2026-02-20 09:37:17.382828818 +0000 UTC m=+0.135618798 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:37:17 localhost podman[282415]: 2026-02-20 09:37:17.398245684 +0000 UTC m=+0.151035664 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute) Feb 20 04:37:17 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:37:17 localhost podman[241968]: time="2026-02-20T09:37:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:37:17 localhost podman[241968]: @ - - [20/Feb/2026:09:37:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1" Feb 20 04:37:17 localhost podman[241968]: @ - - [20/Feb/2026:09:37:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16790 "" "Go-http-client/1.1" Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.204 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.205 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.209 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b09f68d2-2b3e-4ea2-9010-af6c08789365', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.205357', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tape7aa8e2a-27'}, 'message_id': 'bf66b2d2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '69ae577dc37732c16d83dd9a45628a6b5d6db824b4235bf3f5d5960409113f5c'}]}, 'timestamp': '2026-02-20 09:37:18.210614', '_unique_id': 'eac3e8f193f34cec88e3f49ac9bdddc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.213 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 892382541 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd134d19f-a84a-4724-8a40-b6662f82b50f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 892382541, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.214347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf6bc11e-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '1d3a6c1d7cd98b614f10d2b3436988bd425b9783a99dabea9646323a036c12d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.214347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf6bd87a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '13b831b5c5fb8addfbc3e29122081e6957c3ea77772953d537747ce43f6c0e22'}]}, 'timestamp': '2026-02-20 09:37:18.244289', '_unique_id': '48f7822ecf06487b844a096ad45e0eb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.247 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '73c6df0e-69a4-482e-ac49-06b4830e50fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.246976', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf6c578c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': 'e5b3fafaf0121dbf61a367e6309d8fbcf49b544f8109d31b5771da09489db7d7'}]}, 'timestamp': '2026-02-20 09:37:18.247555', '_unique_id': '78856f5f3b9b4fd392ec53aa0ad5c589'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.250 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cacb790b-d7f1-4e61-b630-f990bb39a5f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.250050', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf6ccf32-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '2eb2049e7ae9163cb2eb310538c42d1e6b60a4c1221ae7a8233cbd2f61eb4690'}]}, 'timestamp': '2026-02-20 09:37:18.250625', '_unique_id': '05a1c8659e144531aa1a3a623e3eb5e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 
04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6bf73473-caa9-4c25-844c-3e533e305fe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.253006', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf6d3f12-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': 'b890c86dc786fcbd11396f4ab51d030b2b04dfd3eb153768c757c9e30e10f7fb'}]}, 'timestamp': '2026-02-20 09:37:18.253565', '_unique_id': '5c0bfaed8a154d268e2ea7c3a596b4cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 221184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.256 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e01391c4-49dd-4375-bcd2-2e5d3b7e5b97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221184, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.255910', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf6db082-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '587777d95d0bd9b1fe6bcd4bb7e067826e501ffc01ab1f5af56c265e6dc280b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.255910', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf6dc0e0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': 'eacce2afa0a8d90bbcb9fd5cb723134584bb84440c803c3bf02c0fbf3a686ba7'}]}, 'timestamp': '2026-02-20 09:37:18.256880', '_unique_id': '25c3e1ad10094914bdcdfb88ed34773a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.259 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.259 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2e314a59-d88e-4cc4-b31a-048ef362bb81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.259436', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf6e3b42-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '8d435aba753e7e2d7a3de7ed343d483d7470531841b75b9b21855216eade92ee'}]}, 'timestamp': '2026-02-20 09:37:18.260009', '_unique_id': 'afbabf09586b4ad7a173ea8484c1a3d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29305856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4b8250b-cf7b-456d-8814-08ef7136fdf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29305856, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.262315', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf6eab7c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': 'c4c82bb217d1cb4f3d785f52887d138f543d45bd80145e708b2eecaf1fbbda11'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.262315', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf6ec062-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '8b83831f1693b8b9ad2664ecc2a519280c59d983d7908361e003ac5e292db3c5'}]}, 'timestamp': '2026-02-20 09:37:18.263318', '_unique_id': '9acf0cd8dea94e468b866fdf1c1d1a1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.265 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.266 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.266 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '54c51c36-bdec-47df-b6cb-b881d8c439db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 27, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.265985', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf6f3d76-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '674e5eceb0b8dd8765a19d294d493a7f1ad8a35b405534caed54ac561b59f5e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.265985', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf6f5004-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '940afae317d3228b63f57c0486ae7aa21263d9ba4c156ae7b6a0f25bc29b4cbd'}]}, 'timestamp': '2026-02-20 09:37:18.266982', '_unique_id': '65b70e81365f4fe79cdb613696e0afa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:37:18.280 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aed8ec76-5654-4980-839e-c9b7166cfd42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.269745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 
0, 'disk_name': 'vda'}, 'message_id': 'bf717550-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': '46e4ad3327abd9660ae5fac8a90888e743d515559d926bdb5145ddb92f9c6fd0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.269745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf718a68-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': 'e3e84a9b141e6f43792747d905f8816cc81cfd07872473046a36541742332064'}]}, 'timestamp': '2026-02-20 09:37:18.281591', '_unique_id': 'aa62bf2bbe0145ca867b4c6c03686733'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.read.requests in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1076 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c9c9dba-6467-481e-80ef-3aecaef0d876', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1076, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.284052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf71ff52-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '205bde20c8bd5bb37e226dc2cd571f6d2357a8b84ca3a275683d548ca467290a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.284052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf7210e6-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': 'b8406bbfcd0483f7e3840f90c01a870b168d8792863c5ac151ee895f9d2cc959'}]}, 'timestamp': '2026-02-20 09:37:18.285029', '_unique_id': '47538f61c80e4601bd08f811aa304be1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 970 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dbf23db-fcf3-46f1-87d4-9cf291e132fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 970, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.287458', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf728274-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '213bf9fea00c420f978499647213120dee793142f990a25766a5cde488fac9a0'}]}, 'timestamp': '2026-02-20 09:37:18.287968', '_unique_id': '1b6197eddf14448caebc18d352ecc20e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46186a6c-86ae-4905-80ca-3bb807fcaa1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 984, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.290267', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf72f1c8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '604212ad23ab53a6fabeb0b830f1ad493012b9e28a79067a1729a8ef99f88bc7'}]}, 'timestamp': '2026-02-20 09:37:18.290848', '_unique_id': '01e826052b3a48e986b4848efc187469'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 970 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa15539e-4284-477e-971c-b32568911bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 970, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.293317', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf7365ae-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '77ca163de9c83df833d8ef2430844f8cbf66b9affd3d250f6314fd42a727a5cb'}]}, 'timestamp': '2026-02-20 09:37:18.293887', '_unique_id': 'f17619b25473442facb078b0fbbd76b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20
04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.296 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8df96f73-0312-49cb-8355-06e254edadbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.296174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf73d4f8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': 'fda9fd0bea97736b2c89da9d6976f8680147a4adc921412b8819454a06d75852'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.296174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf73ea2e-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': '5d28b18303bf8694bab7ab3875c2bcaf193e40cb386d793e096f40c85fa0305a'}]}, 'timestamp': '2026-02-20 09:37:18.297157', '_unique_id': 'ca1843c42aae4323a27c7c6001cbdd5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:37:18.298 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:37:18.298 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.299 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 11780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f2897676-5fba-4e7e-9856-6a98f842723d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11780000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:37:18.299592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bf78dcc8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.567730861, 'message_signature': '560e64eab02cc80b5b3a20bf2d1176845bb47b4b00a15ab318bc41d988a7a1d9'}]}, 'timestamp': '2026-02-20 09:37:18.329722', '_unique_id': '1a9e0f2b63f14aa19eafb926cdfed0b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4056858143 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '881fa7f5-2fd0-4467-ad7a-8bb2854e34e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4056858143, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.332154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf79528e-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '1ea4ccd640ea6b21f0b4bc6e4afb380aa52e08603604d9d8c807685f223d6855'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.332154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf796af8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': 'e4742a261dbceea92b91c6e5f9bbe0601de0c228fb4d0da4610897b5a345a527'}]}, 'timestamp': '2026-02-20 09:37:18.333219', '_unique_id': '9c71586a9ebb49319b569882858cfb8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.337 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.338 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9eef23b1-15d4-4e21-8707-faee0018b7e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.337640', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf7a2a56-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': '28855a85f5d35b0d3411757258f27314e4facb836dcca63c78da506433556787'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.337640', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf7a3eb0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': '3b8a400be71d235b1a9f704cfd7b2e3220298478d38d09f7989a38afacf7b08a'}]}, 'timestamp': '2026-02-20 09:37:18.338640', '_unique_id': 'fe02f41e460c427798113e7ba9a715ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.342 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.343 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '329da2f6-9b5e-42df-b931-6f33bfb22a49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 984, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.343220', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf7b21a4-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '8851468cdc21eea6772483fd9487cadb3b8a18bf0a517a10f79c5c0d951a6b88'}]}, 'timestamp': '2026-02-20 09:37:18.344477', '_unique_id': '49c22e944aaa4b56bf752956736ff40d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20
04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.347 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '464b5457-d187-4c53-842b-63d04ef03cc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.347711', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf7bb97a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '24f4f7703049e4317fda9bacb6b8e20b32843d460dcd853aa8536df7dc75d822'}]}, 'timestamp': '2026-02-20 09:37:18.348472', '_unique_id': 'a4d3330e4b1f45f8a1ac9bf0146c57bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.351 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.351 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 48.83984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b134f5c-d2d7-4db4-9e4f-6a16ba032791', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 48.83984375, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:37:18.351246', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'bf7c3df0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.567730861, 
'message_signature': 'f531553425573865772dca1805f0269b42cc357036f3a6fcfee73f939ffc7601'}]}, 'timestamp': '2026-02-20 09:37:18.351779', '_unique_id': 'c2689147542548e8b2fa07b72378ab0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:37:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:37:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:18.476 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:18.478 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:18 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:18 localhost nova_compute[281288]: 2026-02-20 09:37:18.965 281292 DEBUG nova.compute.manager [None req-05f7fce3-9241-4683-8de9-5db322a59e18 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:37:18 localhost nova_compute[281288]: 2026-02-20 09:37:18.970 
281292 INFO nova.compute.manager [None req-05f7fce3-9241-4683-8de9-5db322a59e18 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Retrieving diagnostics#033[00m Feb 20 04:37:19 localhost nova_compute[281288]: 2026-02-20 09:37:19.126 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.848 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.849 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 1.3714855#033[00m Feb 20 04:37:19 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58326 [20/Feb/2026:09:37:18.475] listener listener/metadata 0/0/0/1374/1374 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.866 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.868 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:19 localhost 
ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:19 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58332 [20/Feb/2026:09:37:19.866] listener listener/metadata 0/0/0/29/29 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.896 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404 len: 297 time: 0.0279260#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.910 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.911 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.923 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.923 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.0120754#033[00m Feb 20 04:37:19 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58346 [20/Feb/2026:09:37:19.910] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.930 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.931 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.948 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.949 162777 INFO eventlet.wsgi.server [-] 
192.168.0.140, "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200 len: 136 time: 0.0181334#033[00m Feb 20 04:37:19 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58362 [20/Feb/2026:09:37:19.930] listener listener/metadata 0/0/0/19/19 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.957 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.958 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.969 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:19 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58370 [20/Feb/2026:09:37:19.957] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 
2026-02-20 09:37:19.969 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200 len: 143 time: 0.0117185#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.976 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.977 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.989 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:19 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58382 [20/Feb/2026:09:37:19.976] listener listener/metadata 0/0/0/13/13 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.989 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200 len: 149 time: 0.0121412#033[00m Feb 20 04:37:19 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:37:19.996 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:19.996 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:19 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.008 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58390 [20/Feb/2026:09:37:19.995] listener listener/metadata 0/0/0/13/13 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.009 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200 len: 150 time: 0.0122588#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.015 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:37:20.016 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.030 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.030 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200 len: 139 time: 0.0137398#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58398 [20/Feb/2026:09:37:20.015] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.037 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.038 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 
Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58408 [20/Feb/2026:09:37:20.036] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.049 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.050 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200 len: 139 time: 0.0121861#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.056 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.057 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost 
ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58414 [20/Feb/2026:09:37:20.056] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.070 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/user-data HTTP/1.1" status: 404 len: 297 time: 0.0128839#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.083 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.084 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:37:20.096 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58426 [20/Feb/2026:09:37:20.083] listener listener/metadata 0/0/0/13/13 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.097 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200 len: 155 time: 0.0123501#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.102 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.103 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.121 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.122 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200 len: 138 time: 0.0197082#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58430 [20/Feb/2026:09:37:20.101] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.128 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.129 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.141 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost 
haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58446 [20/Feb/2026:09:37:20.127] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.141 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200 len: 143 time: 0.0122240#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.147 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.148 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.159 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58460 [20/Feb/2026:09:37:20.147] listener listener/metadata 0/0/0/13/13 200 
127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.160 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200 len: 143 time: 0.0117393#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.167 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.168 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.179 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58466 [20/Feb/2026:09:37:20.166] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.179 162777 INFO 
eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200 len: 139 time: 0.0114832#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.186 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.187 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Accept: */*#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Connection: close#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Content-Type: text/plain#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: Host: 169.254.169.254#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: User-Agent: curl/7.84.0#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140#015 Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.198 162777 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 20 04:37:20 localhost haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58474 [20/Feb/2026:09:37:20.186] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Feb 20 04:37:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:37:20.199 162777 INFO eventlet.wsgi.server [-] 192.168.0.140, "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200 len: 139 time: 0.0113223#033[00m Feb 20 04:37:22 localhost 
nova_compute[281288]: 2026-02-20 09:37:22.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:24 localhost nova_compute[281288]: 2026-02-20 09:37:24.128 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:37:25 localhost podman[282502]: 2026-02-20 09:37:25.163489756 +0000 UTC m=+0.097437489 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:37:25 localhost podman[282502]: 2026-02-20 09:37:25.172880286 +0000 UTC m=+0.106827979 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:37:25 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:37:26 localhost openstack_network_exporter[244414]: ERROR 09:37:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:37:26 localhost openstack_network_exporter[244414]: Feb 20 04:37:26 localhost openstack_network_exporter[244414]: ERROR 09:37:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:37:26 localhost openstack_network_exporter[244414]: Feb 20 04:37:27 localhost nova_compute[281288]: 2026-02-20 09:37:27.380 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:29 localhost nova_compute[281288]: 2026-02-20 09:37:29.130 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:29 localhost ovn_controller[156798]: 2026-02-20T09:37:29Z|00068|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory Feb 20 04:37:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2795 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A310150000000001030307) Feb 20 04:37:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2796 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A314280000000001030307) Feb 20 04:37:32 localhost nova_compute[281288]: 2026-02-20 09:37:32.423 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:37:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42378 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A317680000000001030307) Feb 20 04:37:32 localhost podman[282546]: 2026-02-20 09:37:32.534948662 +0000 UTC m=+0.083368163 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:37:32 localhost podman[282546]: 2026-02-20 09:37:32.568585801 +0000 UTC m=+0.117005282 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:37:32 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:37:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2797 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A31C280000000001030307) Feb 20 04:37:34 localhost nova_compute[281288]: 2026-02-20 09:37:34.132 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24915 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A31F690000000001030307) Feb 20 04:37:34 localhost sshd[282570]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:37:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:37:35 localhost podman[282572]: 2026-02-20 09:37:35.09343099 +0000 UTC m=+0.065810333 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64) Feb 20 04:37:35 localhost podman[282572]: 2026-02-20 09:37:35.109732213 +0000 UTC m=+0.082111586 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, release=1770267347, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter) Feb 20 04:37:35 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:37:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:37:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:37:37 localhost podman[282592]: 2026-02-20 09:37:37.149483067 +0000 UTC m=+0.082719844 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
container_name=ovn_controller, managed_by=edpm_ansible) Feb 20 04:37:37 localhost podman[282592]: 2026-02-20 09:37:37.189501842 +0000 UTC m=+0.122738589 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Feb 20 04:37:37 localhost podman[282593]: 2026-02-20 09:37:37.201494802 +0000 UTC m=+0.132508821 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:37:37 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:37:37 localhost podman[282593]: 2026-02-20 09:37:37.231616123 +0000 UTC m=+0.162630122 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 04:37:37 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:37:37 localhost nova_compute[281288]: 2026-02-20 09:37:37.456 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2798 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A32BE90000000001030307) Feb 20 04:37:39 localhost nova_compute[281288]: 2026-02-20 09:37:39.134 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:42 localhost nova_compute[281288]: 2026-02-20 09:37:42.492 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:44 localhost nova_compute[281288]: 2026-02-20 09:37:44.136 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:45 localhost nova_compute[281288]: 2026-02-20 09:37:45.735 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:45 localhost nova_compute[281288]: 2026-02-20 09:37:45.736 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:45 localhost nova_compute[281288]: 
2026-02-20 09:37:45.774 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:45 localhost nova_compute[281288]: 2026-02-20 09:37:45.774 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:37:45 localhost nova_compute[281288]: 2026-02-20 09:37:45.774 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:37:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2799 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A34B680000000001030307) Feb 20 04:37:46 localhost nova_compute[281288]: 2026-02-20 09:37:46.824 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:37:46 localhost nova_compute[281288]: 2026-02-20 09:37:46.825 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:37:46 localhost nova_compute[281288]: 2026-02-20 09:37:46.825 281292 DEBUG nova.network.neutron [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:37:46 localhost nova_compute[281288]: 2026-02-20 09:37:46.826 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:37:47 localhost nova_compute[281288]: 2026-02-20 09:37:47.523 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:47 localhost podman[241968]: time="2026-02-20T09:37:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:37:47 localhost podman[241968]: @ - - [20/Feb/2026:09:37:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1" Feb 20 04:37:47 localhost podman[241968]: @ - - [20/Feb/2026:09:37:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16793 "" "Go-http-client/1.1" Feb 20 04:37:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:37:48 localhost podman[282635]: 2026-02-20 09:37:48.148139881 +0000 UTC m=+0.087375268 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team) Feb 20 04:37:48 localhost podman[282635]: 2026-02-20 09:37:48.159033147 +0000 UTC m=+0.098268534 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible) Feb 20 04:37:48 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:37:48 localhost snmpd[68593]: empty variable list in _query Feb 20 04:37:48 localhost snmpd[68593]: empty variable list in _query Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.445 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:37:49 
localhost nova_compute[281288]: 2026-02-20 09:37:49.465 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.466 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.466 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.467 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.467 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.468 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:49 localhost 
nova_compute[281288]: 2026-02-20 09:37:49.468 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.469 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.469 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.470 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.486 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.487 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.487 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.487 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.488 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:37:49 localhost nova_compute[281288]: 2026-02-20 09:37:49.966 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.083 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.084 281292 DEBUG nova.virt.libvirt.driver 
[None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.324 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.326 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12328MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.326 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.327 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.401 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.401 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.402 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.448 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.913 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:37:50 localhost nova_compute[281288]: 2026-02-20 09:37:50.919 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:37:51 localhost nova_compute[281288]: 
2026-02-20 09:37:51.034 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:37:51 localhost nova_compute[281288]: 2026-02-20 09:37:51.170 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:37:51 localhost nova_compute[281288]: 2026-02-20 09:37:51.171 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:37:52 localhost nova_compute[281288]: 2026-02-20 09:37:52.555 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:54 localhost nova_compute[281288]: 2026-02-20 09:37:54.140 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:55 localhost sshd[282699]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:37:55 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:37:55 localhost podman[282701]: 2026-02-20 09:37:55.923390452 +0000 UTC m=+0.089027519 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:37:55 localhost podman[282701]: 2026-02-20 09:37:55.931055878 +0000 UTC m=+0.096692895 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': 
True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:37:55 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:37:56 localhost openstack_network_exporter[244414]: ERROR 09:37:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:37:56 localhost openstack_network_exporter[244414]: Feb 20 04:37:56 localhost openstack_network_exporter[244414]: ERROR 09:37:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:37:56 localhost openstack_network_exporter[244414]: Feb 20 04:37:57 localhost nova_compute[281288]: 2026-02-20 09:37:57.564 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:37:59 localhost nova_compute[281288]: 2026-02-20 09:37:59.142 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14433 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A385460000000001030307) Feb 20 04:38:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14434 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A389680000000001030307) Feb 20 04:38:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac 
MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2800 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A38B680000000001030307) Feb 20 04:38:02 localhost nova_compute[281288]: 2026-02-20 09:38:02.625 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:38:03 localhost podman[282725]: 2026-02-20 09:38:03.148723888 +0000 UTC m=+0.085748438 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:38:03 localhost podman[282725]: 2026-02-20 09:38:03.159330325 +0000 UTC m=+0.096354845 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:38:03 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:38:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14435 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A391690000000001030307) Feb 20 04:38:04 localhost nova_compute[281288]: 2026-02-20 09:38:04.144 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42379 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A395680000000001030307) Feb 20 04:38:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:38:06.001 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:38:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:38:06.002 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:38:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:38:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:38:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:38:06 localhost systemd[1]: tmp-crun.4eneJZ.mount: Deactivated successfully. Feb 20 04:38:06 localhost podman[282748]: 2026-02-20 09:38:06.149706494 +0000 UTC m=+0.084267692 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.) 
Feb 20 04:38:06 localhost podman[282748]: 2026-02-20 09:38:06.186205801 +0000 UTC m=+0.120767019 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, release=1770267347, name=ubi9/ubi-minimal) Feb 20 04:38:06 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:38:07 localhost nova_compute[281288]: 2026-02-20 09:38:07.628 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14436 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A3A1280000000001030307) Feb 20 04:38:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:38:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:38:08 localhost podman[282768]: 2026-02-20 09:38:08.145669026 +0000 UTC m=+0.085796299 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 04:38:08 localhost systemd[1]: tmp-crun.qxKaIL.mount: Deactivated successfully. 
Feb 20 04:38:08 localhost podman[282769]: 2026-02-20 09:38:08.203538003 +0000 UTC m=+0.138092974 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:38:08 localhost 
podman[282769]: 2026-02-20 09:38:08.210433855 +0000 UTC m=+0.144988826 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:38:08 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:38:08 localhost podman[282768]: 2026-02-20 09:38:08.265132794 +0000 UTC m=+0.205260087 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127) Feb 20 04:38:08 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:38:09 localhost nova_compute[281288]: 2026-02-20 09:38:09.147 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:12 localhost nova_compute[281288]: 2026-02-20 09:38:12.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:14 localhost nova_compute[281288]: 2026-02-20 09:38:14.149 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14437 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A3C1680000000001030307) Feb 20 04:38:17 localhost nova_compute[281288]: 2026-02-20 09:38:17.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:17 localhost podman[241968]: time="2026-02-20T09:38:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:38:17 localhost podman[241968]: @ - - [20/Feb/2026:09:38:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1" Feb 20 04:38:17 localhost podman[241968]: @ - - [20/Feb/2026:09:38:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16795 "" "Go-http-client/1.1" Feb 20 04:38:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:38:19 localhost podman[282811]: 2026-02-20 09:38:19.147502338 +0000 UTC m=+0.068706572 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, tcib_managed=true) Feb 20 04:38:19 localhost nova_compute[281288]: 2026-02-20 09:38:19.150 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:19 localhost podman[282811]: 2026-02-20 09:38:19.162006786 +0000 UTC m=+0.083211060 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:38:19 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:38:22 localhost nova_compute[281288]: 2026-02-20 09:38:22.708 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:24 localhost nova_compute[281288]: 2026-02-20 09:38:24.151 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:38:26 localhost podman[282830]: 2026-02-20 09:38:26.12424606 +0000 UTC m=+0.065333248 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:38:26 localhost podman[282830]: 2026-02-20 09:38:26.162038826 +0000 UTC m=+0.103125994 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:38:26 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:38:26 localhost openstack_network_exporter[244414]: ERROR 09:38:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:38:26 localhost openstack_network_exporter[244414]: Feb 20 04:38:26 localhost openstack_network_exporter[244414]: ERROR 09:38:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:38:26 localhost openstack_network_exporter[244414]: Feb 20 04:38:27 localhost nova_compute[281288]: 2026-02-20 09:38:27.712 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:29 localhost nova_compute[281288]: 2026-02-20 09:38:29.152 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8531 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A3FA750000000001030307) Feb 20 04:38:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8532 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A3FE680000000001030307) Feb 20 04:38:31 localhost sshd[282924]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:38:31 localhost systemd-logind[759]: New session 62 of user zuul. Feb 20 04:38:31 localhost systemd[1]: Started Session 62 of User zuul. 
Feb 20 04:38:32 localhost python3[282964]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:38:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14438 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A401680000000001030307) Feb 20 04:38:32 localhost systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation. Feb 20 04:38:32 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 20 04:38:32 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:38:32 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:38:32 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 20 04:38:32 localhost subscription-manager[282965]: Unregistered machine with identity: 430a9023-94d5-4ff5-8ad4-f0155783873a Feb 20 04:38:32 localhost nova_compute[281288]: 2026-02-20 09:38:32.733 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8533 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A406680000000001030307) Feb 20 04:38:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:38:34 localhost nova_compute[281288]: 2026-02-20 09:38:34.155 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:34 localhost podman[282968]: 2026-02-20 09:38:34.18779137 +0000 UTC m=+0.121363927 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 
'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:38:34 localhost podman[282968]: 2026-02-20 09:38:34.199937185 +0000 UTC m=+0.133509783 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:38:34 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:38:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2801 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A409690000000001030307) Feb 20 04:38:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:38:37 localhost podman[282990]: 2026-02-20 09:38:37.144498351 +0000 UTC m=+0.082214670 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc.) 
Feb 20 04:38:37 localhost podman[282990]: 2026-02-20 09:38:37.157256194 +0000 UTC m=+0.094972583 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter) Feb 20 04:38:37 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:38:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8534 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A416280000000001030307) Feb 20 04:38:37 localhost nova_compute[281288]: 2026-02-20 09:38:37.762 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:38:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:38:39 localhost podman[283010]: 2026-02-20 09:38:39.145707465 +0000 UTC m=+0.083301723 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3) Feb 20 04:38:39 localhost nova_compute[281288]: 2026-02-20 09:38:39.183 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:39 localhost podman[283011]: 2026-02-20 09:38:39.205537462 +0000 UTC m=+0.140448067 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Feb 20 04:38:39 localhost podman[283011]: 2026-02-20 09:38:39.235438094 +0000 UTC m=+0.170348699 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent) Feb 20 04:38:39 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:38:39 localhost podman[283010]: 2026-02-20 09:38:39.290448473 +0000 UTC m=+0.228042731 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, managed_by=edpm_ansible, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3) Feb 20 04:38:39 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:38:40 localhost sshd[283057]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:38:42 localhost nova_compute[281288]: 2026-02-20 09:38:42.791 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:44 localhost nova_compute[281288]: 2026-02-20 09:38:44.212 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8535 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A437680000000001030307) Feb 20 04:38:47 localhost podman[241968]: time="2026-02-20T09:38:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:38:47 localhost podman[241968]: @ - - [20/Feb/2026:09:38:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1" Feb 20 04:38:47 localhost podman[241968]: @ - - [20/Feb/2026:09:38:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16798 "" "Go-http-client/1.1" Feb 20 04:38:47 localhost nova_compute[281288]: 2026-02-20 09:38:47.821 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:49 localhost nova_compute[281288]: 2026-02-20 09:38:49.247 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:38:50 localhost systemd[1]: tmp-crun.rhkjpv.mount: Deactivated successfully. Feb 20 04:38:50 localhost podman[283059]: 2026-02-20 09:38:50.154821031 +0000 UTC m=+0.093419254 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Feb 20 04:38:50 localhost podman[283059]: 2026-02-20 09:38:50.167030298 +0000 UTC m=+0.105628521 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:38:50 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:38:51 localhost nova_compute[281288]: 2026-02-20 09:38:51.172 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:51 localhost nova_compute[281288]: 2026-02-20 09:38:51.174 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:51 localhost nova_compute[281288]: 2026-02-20 09:38:51.174 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:38:51 localhost nova_compute[281288]: 2026-02-20 09:38:51.174 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:38:51 localhost nova_compute[281288]: 2026-02-20 09:38:51.589 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:38:51 localhost nova_compute[281288]: 2026-02-20 09:38:51.589 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:38:51 localhost nova_compute[281288]: 2026-02-20 09:38:51.590 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:38:51 localhost nova_compute[281288]: 2026-02-20 09:38:51.590 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.040 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.055 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.055 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.056 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.056 281292 DEBUG oslo_service.periodic_task [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.057 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.057 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.058 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.058 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.058 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.059 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.077 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.078 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.078 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.079 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 
09:38:52.079 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.572 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.637 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.638 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.855 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.890 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.892 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12331MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.893 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.893 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.954 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.954 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:38:52 localhost nova_compute[281288]: 2026-02-20 09:38:52.955 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:38:53 localhost nova_compute[281288]: 2026-02-20 09:38:53.001 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:38:53 localhost nova_compute[281288]: 2026-02-20 09:38:53.480 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:38:53 localhost nova_compute[281288]: 2026-02-20 09:38:53.486 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:38:53 localhost nova_compute[281288]: 
2026-02-20 09:38:53.505 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:38:53 localhost nova_compute[281288]: 2026-02-20 09:38:53.508 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:38:53 localhost nova_compute[281288]: 2026-02-20 09:38:53.508 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:38:54 localhost nova_compute[281288]: 2026-02-20 09:38:54.286 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:56 localhost openstack_network_exporter[244414]: ERROR 09:38:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:38:56 localhost openstack_network_exporter[244414]: Feb 20 04:38:56 localhost openstack_network_exporter[244414]: ERROR 09:38:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an 
existing datapath Feb 20 04:38:56 localhost openstack_network_exporter[244414]: Feb 20 04:38:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:38:57 localhost podman[283124]: 2026-02-20 09:38:57.143246355 +0000 UTC m=+0.080131465 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:38:57 localhost podman[283124]: 2026-02-20 09:38:57.176191181 +0000 UTC m=+0.113076341 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:38:57 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:38:57 localhost nova_compute[281288]: 2026-02-20 09:38:57.857 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:59 localhost nova_compute[281288]: 2026-02-20 09:38:59.288 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:38:59 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Feb 20 04:39:00 localhost sshd[283148]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:39:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=614 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A46FA50000000001030307) Feb 20 04:39:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=615 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A473A80000000001030307) Feb 20 04:39:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8536 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A477680000000001030307) Feb 20 04:39:02 localhost nova_compute[281288]: 2026-02-20 09:39:02.888 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=616 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A47BA90000000001030307) Feb 20 04:39:04 localhost nova_compute[281288]: 2026-02-20 09:39:04.335 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:39:04 localhost systemd[1]: tmp-crun.UqdEeV.mount: Deactivated successfully. Feb 20 04:39:04 localhost podman[283186]: 2026-02-20 09:39:04.452152781 +0000 UTC m=+0.084696007 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:39:04 localhost podman[283186]: 2026-02-20 09:39:04.489185304 +0000 UTC m=+0.121728580 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:39:04 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:39:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14439 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A47F680000000001030307) Feb 20 04:39:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:39:06.002 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:39:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:39:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:39:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:39:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:39:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=617 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A48B690000000001030307) Feb 20 04:39:07 localhost nova_compute[281288]: 2026-02-20 09:39:07.930 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:39:08 localhost podman[283228]: 2026-02-20 09:39:08.132170407 +0000 UTC m=+0.073399057 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal) Feb 20 04:39:08 localhost podman[283228]: 2026-02-20 09:39:08.145718305 +0000 UTC m=+0.086946965 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, 
org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vcs-type=git, release=1770267347, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64) Feb 20 04:39:08 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:39:09 localhost nova_compute[281288]: 2026-02-20 09:39:09.377 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:39:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:39:10 localhost podman[283249]: 2026-02-20 09:39:10.14821219 +0000 UTC m=+0.085177350 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:39:10 localhost 
podman[283248]: 2026-02-20 09:39:10.197360667 +0000 UTC m=+0.136635879 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS) Feb 20 04:39:10 localhost podman[283249]: 2026-02-20 09:39:10.228093325 +0000 UTC m=+0.165058455 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127) Feb 20 04:39:10 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:39:10 localhost podman[283248]: 2026-02-20 09:39:10.283915378 +0000 UTC m=+0.223190560 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 04:39:10 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:39:11 localhost sshd[283290]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:39:12 localhost nova_compute[281288]: 2026-02-20 09:39:12.970 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:13 localhost sshd[283292]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:39:13 localhost systemd-logind[759]: New session 63 of user tripleo-admin. Feb 20 04:39:13 localhost systemd[1]: Created slice User Slice of UID 1003. Feb 20 04:39:13 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Feb 20 04:39:13 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Feb 20 04:39:13 localhost systemd[1]: Starting User Manager for UID 1003... Feb 20 04:39:13 localhost systemd[283296]: Queued start job for default target Main User Target. Feb 20 04:39:13 localhost systemd[283296]: Created slice User Application Slice. Feb 20 04:39:13 localhost systemd[283296]: Started Mark boot as successful after the user session has run 2 minutes. Feb 20 04:39:13 localhost systemd[283296]: Started Daily Cleanup of User's Temporary Directories. Feb 20 04:39:13 localhost systemd[283296]: Reached target Paths. Feb 20 04:39:13 localhost systemd[283296]: Reached target Timers. Feb 20 04:39:13 localhost systemd[283296]: Starting D-Bus User Message Bus Socket... Feb 20 04:39:13 localhost systemd[283296]: Starting Create User's Volatile Files and Directories... Feb 20 04:39:13 localhost systemd[283296]: Listening on D-Bus User Message Bus Socket. Feb 20 04:39:13 localhost systemd[283296]: Reached target Sockets. Feb 20 04:39:13 localhost systemd[283296]: Finished Create User's Volatile Files and Directories. Feb 20 04:39:13 localhost systemd[283296]: Reached target Basic System. Feb 20 04:39:13 localhost systemd[283296]: Reached target Main User Target. Feb 20 04:39:13 localhost systemd[283296]: Startup finished in 170ms. 
Feb 20 04:39:13 localhost systemd[1]: Started User Manager for UID 1003. Feb 20 04:39:13 localhost systemd[1]: Started Session 63 of User tripleo-admin. Feb 20 04:39:14 localhost python3[283439]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)#012add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"#012# 100 ceph_dashboard (8443)#012add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"#012# 100 ceph_grafana (3100)#012add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"#012# 100 ceph_prometheus (9092)#012add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"#012# 100 ceph_rgw (8080)#012add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"#012# 110 ceph_mon (6789, 3300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"#012# 112 ceph_mds (6800-7300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"#012# 113 ceph_mgr (6800-7300, 8444)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"#012# 120 ceph_nfs (2049, 12049)#012add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"#012# 123 ceph_dashboard (9090, 9094, 9283)#012add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"#012 insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None 
validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:39:14 localhost nova_compute[281288]: 2026-02-20 09:39:14.412 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:15 localhost python3[283583]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 20 04:39:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=618 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A4AB680000000001030307) Feb 20 04:39:16 localhost systemd[1]: Stopping Netfilter Tables... Feb 20 04:39:16 localhost systemd[1]: nftables.service: Deactivated successfully. Feb 20 04:39:16 localhost systemd[1]: Stopped Netfilter Tables. Feb 20 04:39:16 localhost systemd[1]: Starting Netfilter Tables... Feb 20 04:39:16 localhost systemd[1]: Finished Netfilter Tables. 
Feb 20 04:39:17 localhost sshd[283607]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:39:17 localhost podman[241968]: time="2026-02-20T09:39:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:39:17 localhost podman[241968]: @ - - [20/Feb/2026:09:39:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1" Feb 20 04:39:17 localhost podman[241968]: @ - - [20/Feb/2026:09:39:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16805 "" "Go-http-client/1.1" Feb 20 04:39:18 localhost nova_compute[281288]: 2026-02-20 09:39:18.000 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.206 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:39:18.240 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.240 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73b692cd-f415-470c-b4ab-0c771134f529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.207883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 
'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f1dc1c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': 'ee9c663745fe97c61f07363f06a3fc1b3c1e5142efa73d6a16537217fdcabd1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.207883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f1f0c6-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '8b841ebbaecf2f042798d138f3c1364011eca809eb6ccbaa972bf27a3f20b2d1'}]}, 'timestamp': '2026-02-20 09:39:18.241317', '_unique_id': 'c54326b00586456c8e358a1f533c17d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 8786 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5329101e-6210-4efe-b4bf-76ee4d2d0fc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 8786, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.244317', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f33486-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '6003d56d6c185069487bdb80ad7b10755df6f422b37ec764c53e4114814247e9'}]}, 'timestamp': '2026-02-20 09:39:18.249671', '_unique_id': '6a9380c231cd41279de995091f2ad8f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.252 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.252 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f3040e2-1e59-494e-9421-629a8e6d5087', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.252066', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f3a704-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': 'a92123936a442f2fede5ca3f55899b4490410d6937770069389ae024f6071607'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.252066', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f3bb68-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': 'c67afcdd1ebc5d5c897cc9cb80bcafc403d222c044ccc743fe4dae802247e580'}]}, 'timestamp': '2026-02-20 09:39:18.253064', '_unique_id': 'e49b8534f8c74c259132de9a5f2e213d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '946aace1-ebca-45a0-9687-a75e5f8d0bd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.255443', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f42bac-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': 'b778305166c57ad0197da68064e340c9c33c9f2d08ec8a42f5f395a1bf11d7f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.255443', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f43e58-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '5274ae861fba453ce040f40746a1034e9753d3add6d1a0b69637d9f3f1433612'}]}, 'timestamp': '2026-02-20 09:39:18.256405', '_unique_id': 'de7b27e97d444ceb860c38b488d5c76f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.258 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.259 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ec1c70-30a7-49f7-8ba5-002805fcc111', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.258813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f4ae24-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '4d12b331b486fa7af372ab7091afcf36485bfb53abc66711dd616ee2a13306c2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': 
'91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.258813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f4c3aa-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '7bd86e9a1ee20c342115a53766b6f4adfc6c58f6d43b36df0c909ebb65a98f60'}]}, 'timestamp': '2026-02-20 09:39:18.259932', '_unique_id': '0ef628fe89e440f99936f904d3b9b311'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dce8f2e-87e3-4f54-89f5-4e93b15f29d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.262189', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f531dc-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': 'd41ffc208b0f2c4e63c326ad3f9887254326c4c590371fd77678c8b4dfa034c3'}]}, 'timestamp': '2026-02-20 09:39:18.262696', '_unique_id': 'd5ead5902ac245cdb4d54ca57d047978'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging Feb 
20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging Feb 20 
04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.264 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.264 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f27a502-f0b0-4b10-9df7-19f172aee987', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.264767', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 
'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f59622-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '8c50742407c66f2910a5218e304c3b42a7e1432626c8e3989db7b01de26d43b1'}]}, 'timestamp': '2026-02-20 09:39:18.265225', '_unique_id': '4ab51598e3ba4120b910b509dbd57cad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.267 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '835ab93c-b5df-41c8-b87f-8a986e6d7a6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.267303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f7ad54-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '8c61f5995d17b8eae4c2aa55bc48ea93c479ea0f353ffa3cc08171013c81b39e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.267303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f7bea2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '56309ac1167b2086bf29485462eff2f595f7891a578c9f376afa0ffb3dfd4500'}]}, 'timestamp': '2026-02-20 09:39:18.279337', '_unique_id': '3419b9dcb4ad4b05af743fd18aca0bbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.280 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.281 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0a9b131-3fca-49fd-8823-bf6405ea8b13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.281953', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f8363e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '527886f55b226962a8f6f893f3425863455ebc0242bfefa32c07d7722ac711c5'}]}, 'timestamp': '2026-02-20 09:39:18.282427', '_unique_id': '0ccadb522f4f495d95ac54b1d10054f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR 
oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3080cc46-cda0-4c89-ac61-a64ddcc04934', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.284500', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f89a52-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '72d47c827183392c0ff599418df6e70c6bd651408caa34c7d1ee65530c4f5ce9'}]}, 'timestamp': '2026-02-20 09:39:18.284992', '_unique_id': '28cd9e572c454a6f9fc874a130aa6aae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.286 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '23fbc0af-636a-4d5f-a4cd-133c0bc8932e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.287226', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f90352-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '57a493a0a076a8fc018d7e107ee6cf1bc84a78afe1196b1b8bb64f560c364335'}]}, 'timestamp': '2026-02-20 09:39:18.287707', '_unique_id': 'c42b9ff86dcc40c399127ffe6d0625c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.289 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 12790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b21e0a7-3cfa-4010-9aba-6ba6a144eb75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12790000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:39:18.289925', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '06fc3ed2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.547470831, 'message_signature': '33f55981dadb3f2fded64b016b56856c9630d20152505bc2ec2f86e9cfbe7e45'}]}, 'timestamp': '2026-02-20 09:39:18.308889', '_unique_id': 'c8d0c37a760547418ba4fa0dcd03d7fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.310 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32f3d706-994f-40a8-adfd-256baf74a4d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.310993', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06fca3cc-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '99be17f8c70d875c931cb6087003622d9bf28a37538943628f1fc31a45002d1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.310993', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06fcb330-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '4c402f859b728a6cb5efd6cfe4b3ef8f3693125720825df5871740ea47b390c0'}]}, 'timestamp': '2026-02-20 09:39:18.311843', '_unique_id': 'a2d6f59d4a6b42668ec275e6f733fa27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1dbfd6a-3f8e-47ed-8767-8583cbf952c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.313950', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06fd1776-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': 'ed2076580d54a86cfa626750130bc0cad8116a85daffbc2361fe90e53be376db'}]}, 'timestamp': '2026-02-20 09:39:18.314408', '_unique_id': '718d4925cff247719ef1e2ddabbb711c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '477d61d3-97d8-40fc-85e2-f489ac8c1c1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.316447', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06fd7a04-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '725a9ba815b87b2fcc493ae427d6ca7c922fb1d948332c5ee879df6f91b9ad59'}]}, 'timestamp': '2026-02-20 09:39:18.316930', '_unique_id': 'cf9fa9ad75e94ee08cfe097fd0fe2275'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.317 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.318 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 5839 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ab52d0c2-f99d-46c2-81e5-b6e5ffb70612', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 5839, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.318987', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06fddbf2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '8f5d57e8aa01914c40219d282109d3c208316498119e5fc8b0fb1ec389978193'}]}, 'timestamp': '2026-02-20 09:39:18.319435', '_unique_id': '0fa6be3f2f76403c932940c9ca671f7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '023fb75c-4a44-4687-8ad8-27219f4b4614', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.321468', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06fe3ea8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '69e23a18ed303adc8733045c0f78729ac74c8fa0aee5ec510793f2dda2616559'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.321468', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06fe4f24-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '95dd703cbe16c0b1e454a0cc2efdc524204101a631b2bce571fef2662238cefe'}]}, 'timestamp': '2026-02-20 09:39:18.322364', '_unique_id': '7dba012e021e4c0a9851e5e524be3222'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.324 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.324 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4f6c0c1-0403-4af8-abfb-56556a4f5865', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:39:18.324435', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '06feb1ee-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.547470831, 'message_signature': 'dbe6757ab778d5cf4338acacb65f53c03265d9c094a81297dd13d78028c68914'}]}, 'timestamp': '2026-02-20 09:39:18.324898', '_unique_id': '8c6efea9a48840cfa8b142f922008572'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3b7ee831-e59d-4569-9883-b228b1794a7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.326935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06ff1292-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '8265f9c1f66d77397c840668359c9153ddbe18fde9f0fb880d468e8b693b05c6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.326935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06ff228c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '2192187047f62b311f84a113c5b48253133321ed0e9fefbc5f43eb6f815dbf3c'}]}, 'timestamp': '2026-02-20 09:39:18.327800', '_unique_id': 'ea31809b1307488da660b807a2c49de2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.328 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.330 12 DEBUG 
ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7eae78f8-dd82-4ec0-98eb-5072d01f4a19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.330023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06ff8ad8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': 'd4010639a29e3fd462dde92af2efb4532351908aeef8a56763602767958b2823'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 
'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.330023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06ff9bb8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '52bd550b7187682bf4e57f0ada400cf3865427f0105e70d4673e482f28967c26'}]}, 'timestamp': '2026-02-20 09:39:18.330874', '_unique_id': '36caaf21a082427db728c94b6b1f9c4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.331 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.333 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce79320e-5c0a-484d-aeda-cae262a2267d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.332958', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06fffdd8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '1a1737010af18add42c883e5a621c61d56f2b00bae9c8650f4cd2a4163becf43'}]}, 'timestamp': '2026-02-20 09:39:18.333410', '_unique_id': 'aa664bb066ff4ea6a1f94afb2a3b2ac2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:39:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:39:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:39:19 localhost nova_compute[281288]: 2026-02-20 09:39:19.442 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:39:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:39:21 localhost systemd[1]: tmp-crun.hPnKsn.mount: Deactivated successfully.
Feb 20 04:39:21 localhost podman[283664]: 2026-02-20 09:39:21.152805647 +0000 UTC m=+0.085193691 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes':
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 20 04:39:21 localhost podman[283664]: 2026-02-20 09:39:21.162705993 +0000 UTC m=+0.095094057 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 20 04:39:21 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 04:39:23 localhost nova_compute[281288]: 2026-02-20 09:39:23.003 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:39:24 localhost nova_compute[281288]: 2026-02-20 09:39:24.479 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:39:26 localhost openstack_network_exporter[244414]: ERROR 09:39:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 04:39:26 localhost openstack_network_exporter[244414]:
Feb 20 04:39:26 localhost openstack_network_exporter[244414]: ERROR 09:39:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 04:39:26 localhost openstack_network_exporter[244414]:
Feb 20 04:39:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 04:39:27 localhost systemd[1]: tmp-crun.sMSXpq.mount: Deactivated successfully.
Feb 20 04:39:27 localhost podman[283789]: 2026-02-20 09:39:27.409531954 +0000 UTC m=+0.103238868 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:39:27 localhost podman[283789]: 2026-02-20 09:39:27.419203732 +0000 UTC m=+0.112910676 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:39:27 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:39:28 localhost nova_compute[281288]: 2026-02-20 09:39:28.006 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:28 localhost podman[283876]: Feb 20 04:39:28 localhost podman[283876]: 2026-02-20 09:39:28.036764796 +0000 UTC m=+0.083933072 container create e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, name=rhceph, RELEASE=main, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:39:28 localhost 
systemd[1]: Started libpod-conmon-e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771.scope. Feb 20 04:39:28 localhost podman[283876]: 2026-02-20 09:39:28.005347856 +0000 UTC m=+0.052516202 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:39:28 localhost systemd[1]: Started libcrun container. Feb 20 04:39:28 localhost podman[283876]: 2026-02-20 09:39:28.135357309 +0000 UTC m=+0.182525585 container init e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:39:28 localhost podman[283876]: 2026-02-20 09:39:28.146830124 +0000 UTC m=+0.193998410 container start e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, architecture=x86_64, 
build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, RELEASE=main, vcs-type=git, GIT_CLEAN=True, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:39:28 localhost podman[283876]: 2026-02-20 09:39:28.147672599 +0000 UTC m=+0.194840935 container attach e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, build-date=2026-02-09T10:25:24Z, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, name=rhceph, version=7, architecture=x86_64, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:39:28 localhost vibrant_jemison[283891]: 167 167 Feb 20 04:39:28 localhost systemd[1]: libpod-e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771.scope: Deactivated successfully. Feb 20 04:39:28 localhost podman[283876]: 2026-02-20 09:39:28.152199309 +0000 UTC m=+0.199367615 container died e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, ceph=True, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, 
io.buildah.version=1.42.2, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347) Feb 20 04:39:28 localhost podman[283896]: 2026-02-20 09:39:28.249181852 +0000 UTC m=+0.083104835 container remove e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=1770267347, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph) Feb 20 04:39:28 localhost systemd[1]: libpod-conmon-e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771.scope: Deactivated successfully. Feb 20 04:39:28 localhost systemd[1]: Reloading. Feb 20 04:39:28 localhost systemd-rc-local-generator[283934]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:39:28 localhost systemd-sysv-generator[283940]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: var-lib-containers-storage-overlay-e7762a87729e6faf8de3f77f15b2bfe28c40ae41738f2fe65325dcca7d8c2ca4-merged.mount: Deactivated successfully.
Feb 20 04:39:28 localhost systemd[1]: Reloading.
Feb 20 04:39:28 localhost systemd-sysv-generator[283982]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 04:39:28 localhost systemd-rc-local-generator[283979]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 04:39:29 localhost systemd[1]: Starting Ceph mds.mds.np0005625204.wnsphl for a8557ee9-b55d-5519-942c-cf8f6172f1d8...
Feb 20 04:39:29 localhost podman[284043]: Feb 20 04:39:29 localhost podman[284043]: 2026-02-20 09:39:29.406353113 +0000 UTC m=+0.078513955 container create f489614c7976d084794979376ce3e3c066a553b0e4a87593ef4817bb5a855475 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mds-mds-np0005625204-wnsphl, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7) Feb 20 04:39:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8787d6f317aff53ba0fdc5c30f1a1911fd26611ed096de37d13cf2d1d6681323/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 04:39:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8787d6f317aff53ba0fdc5c30f1a1911fd26611ed096de37d13cf2d1d6681323/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 04:39:29 localhost kernel: 
xfs filesystem being remounted at /var/lib/containers/storage/overlay/8787d6f317aff53ba0fdc5c30f1a1911fd26611ed096de37d13cf2d1d6681323/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 04:39:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8787d6f317aff53ba0fdc5c30f1a1911fd26611ed096de37d13cf2d1d6681323/merged/var/lib/ceph/mds/ceph-mds.np0005625204.wnsphl supports timestamps until 2038 (0x7fffffff) Feb 20 04:39:29 localhost podman[284043]: 2026-02-20 09:39:29.456447859 +0000 UTC m=+0.128608691 container init f489614c7976d084794979376ce3e3c066a553b0e4a87593ef4817bb5a855475 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mds-mds-np0005625204-wnsphl, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, build-date=2026-02-09T10:25:24Z) Feb 20 04:39:29 localhost podman[284043]: 2026-02-20 09:39:29.467176941 +0000 UTC 
m=+0.139337773 container start f489614c7976d084794979376ce3e3c066a553b0e4a87593ef4817bb5a855475 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mds-mds-np0005625204-wnsphl, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, version=7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z) Feb 20 04:39:29 localhost bash[284043]: f489614c7976d084794979376ce3e3c066a553b0e4a87593ef4817bb5a855475 Feb 20 04:39:29 localhost podman[284043]: 2026-02-20 09:39:29.371486947 +0000 UTC m=+0.043647799 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:39:29 localhost systemd[1]: Started Ceph mds.mds.np0005625204.wnsphl for a8557ee9-b55d-5519-942c-cf8f6172f1d8. 
Feb 20 04:39:29 localhost ceph-mds[284061]: set uid:gid to 167:167 (ceph:ceph)
Feb 20 04:39:29 localhost ceph-mds[284061]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mds, pid 2
Feb 20 04:39:29 localhost ceph-mds[284061]: main not setting numa affinity
Feb 20 04:39:29 localhost ceph-mds[284061]: pidfile_write: ignore empty --pid-file
Feb 20 04:39:29 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mds-mds-np0005625204-wnsphl[284057]: starting mds.mds.np0005625204.wnsphl at
Feb 20 04:39:29 localhost ceph-mds[284061]: mds.mds.np0005625204.wnsphl Updating MDS map to version 7 from mon.1
Feb 20 04:39:29 localhost nova_compute[281288]: 2026-02-20 09:39:29.533 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:39:30 localhost ceph-mds[284061]: mds.mds.np0005625204.wnsphl Updating MDS map to version 8 from mon.1
Feb 20 04:39:30 localhost ceph-mds[284061]: mds.mds.np0005625204.wnsphl Monitors have assigned me to become a standby.
Feb 20 04:39:32 localhost systemd[1]: session-62.scope: Deactivated successfully.
Feb 20 04:39:32 localhost systemd-logind[759]: Session 62 logged out. Waiting for processes to exit.
Feb 20 04:39:32 localhost systemd-logind[759]: Removed session 62.
Feb 20 04:39:33 localhost nova_compute[281288]: 2026-02-20 09:39:33.008 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:39:34 localhost nova_compute[281288]: 2026-02-20 09:39:34.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:39:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:39:34 localhost systemd[1]: tmp-crun.fmLnjx.mount: Deactivated successfully.
Feb 20 04:39:34 localhost podman[284099]: 2026-02-20 09:39:34.902290455 +0000 UTC m=+0.088667388 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:39:34 localhost podman[284099]: 2026-02-20 09:39:34.911167569 +0000 UTC m=+0.097544512 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:39:34 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:39:35 localhost podman[284229]: 2026-02-20 09:39:35.818241789 +0000 UTC m=+0.085347425 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, maintainer=Guillaume Abrioux , release=1770267347, name=rhceph, ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph) Feb 20 04:39:35 localhost podman[284229]: 2026-02-20 09:39:35.905989288 +0000 UTC m=+0.173094914 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , release=1770267347, vcs-type=git, ceph=True, build-date=2026-02-09T10:25:24Z, version=7, com.redhat.component=rhceph-container) Feb 20 04:39:38 localhost nova_compute[281288]: 2026-02-20 09:39:38.011 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:39:38 localhost podman[284419]: 2026-02-20 09:39:38.484543194 +0000 UTC m=+0.070613290 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1770267347, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.openshift.tags=minimal rhel9) Feb 20 04:39:38 localhost podman[284419]: 2026-02-20 09:39:38.523039783 +0000 UTC m=+0.109109859 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.7, release=1770267347, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 20 04:39:38 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:39:39 localhost nova_compute[281288]: 2026-02-20 09:39:39.616 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:39:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:39:41 localhost podman[284439]: 2026-02-20 09:39:41.151797859 +0000 UTC m=+0.088472462 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:39:41 localhost podman[284439]: 2026-02-20 09:39:41.187434599 +0000 UTC m=+0.124109192 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:39:41 localhost podman[284440]: 2026-02-20 09:39:41.19913588 +0000 UTC m=+0.134534384 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:39:41 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:39:41 localhost podman[284440]: 2026-02-20 09:39:41.232151819 +0000 UTC m=+0.167550283 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:39:41 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:39:43 localhost nova_compute[281288]: 2026-02-20 09:39:43.864 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:44 localhost nova_compute[281288]: 2026-02-20 09:39:44.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:39:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 4979 writes, 22K keys, 4979 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4979 writes, 657 syncs, 7.58 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 40 writes, 107 keys, 40 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s#012Interval WAL: 40 writes, 20 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:39:47 localhost podman[241968]: time="2026-02-20T09:39:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:39:47 localhost podman[241968]: @ - - [20/Feb/2026:09:39:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150740 "" "Go-http-client/1.1" Feb 20 04:39:47 localhost podman[241968]: @ - - [20/Feb/2026:09:39:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17284 "" "Go-http-client/1.1" Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.053 281292 DEBUG oslo_service.periodic_task [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.053 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.077 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.078 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.078 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.871 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.913 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.914 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.914 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:39:48 localhost nova_compute[281288]: 2026-02-20 09:39:48.915 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.379 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.394 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.394 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.395 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.396 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.396 281292 DEBUG 
oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.397 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.397 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.398 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.398 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.399 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.414 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.415 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.416 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.416 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 
09:39:49.417 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.672 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.870 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.949 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:39:49 localhost nova_compute[281288]: 2026-02-20 09:39:49.950 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.174 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.176 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12305MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.176 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.177 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.257 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.257 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.257 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.292 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.723 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.730 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 
2026-02-20 09:39:50.759 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.761 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:39:50 localhost nova_compute[281288]: 2026-02-20 09:39:50.762 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:39:51 localhost sshd[284526]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:39:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 5820 writes, 25K keys, 5820 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5820 writes, 845 syncs, 6.89 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 
00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 104 writes, 322 keys, 104 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s#012Interval WAL: 104 writes, 42 syncs, 2.48 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:39:51 localhost podman[284528]: 2026-02-20 09:39:51.828854092 +0000 UTC m=+0.097351642 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute) Feb 20 04:39:51 localhost podman[284528]: 2026-02-20 09:39:51.868079294 +0000 UTC m=+0.136576834 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Feb 20 04:39:51 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:39:53 localhost nova_compute[281288]: 2026-02-20 09:39:53.876 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:54 localhost nova_compute[281288]: 2026-02-20 09:39:54.699 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:56 localhost openstack_network_exporter[244414]: ERROR 09:39:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:39:56 localhost openstack_network_exporter[244414]: Feb 20 04:39:56 localhost openstack_network_exporter[244414]: ERROR 09:39:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:39:56 localhost openstack_network_exporter[244414]: Feb 20 04:39:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:39:58 localhost podman[284547]: 2026-02-20 09:39:58.14132832 +0000 UTC m=+0.080600291 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:39:58 localhost podman[284547]: 2026-02-20 09:39:58.14935239 +0000 UTC m=+0.088624351 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:39:58 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:39:58 localhost nova_compute[281288]: 2026-02-20 09:39:58.881 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:39:59 localhost nova_compute[281288]: 2026-02-20 09:39:59.701 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:01 localhost sshd[284569]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:40:03 localhost nova_compute[281288]: 2026-02-20 09:40:03.884 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:04 localhost nova_compute[281288]: 2026-02-20 09:40:04.746 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:40:05 localhost podman[284571]: 2026-02-20 09:40:05.143051715 +0000 UTC m=+0.081090357 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:40:05 localhost podman[284571]: 2026-02-20 09:40:05.158811016 +0000 UTC m=+0.096849638 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:40:05 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:40:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:40:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:40:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:40:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:40:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:40:06.004 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:40:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:40:08 localhost podman[284611]: 2026-02-20 09:40:08.878174939 +0000 UTC m=+0.091956565 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vcs-type=git, version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:40:08 localhost nova_compute[281288]: 2026-02-20 09:40:08.885 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:08 localhost podman[284611]: 2026-02-20 09:40:08.899049038 +0000 UTC m=+0.112830674 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, 
config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git) Feb 20 04:40:08 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:40:09 localhost nova_compute[281288]: 2026-02-20 09:40:09.776 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:10 localhost sshd[284632]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:40:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:40:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:40:11 localhost systemd[1]: tmp-crun.ekaiHt.mount: Deactivated successfully. 
Feb 20 04:40:11 localhost podman[284634]: 2026-02-20 09:40:11.585975338 +0000 UTC m=+0.090577051 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 04:40:11 localhost podman[284635]: 2026-02-20 09:40:11.629034539 +0000 UTC m=+0.130109553 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 
'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:40:11 localhost podman[284634]: 2026-02-20 09:40:11.656107472 +0000 UTC m=+0.160709195 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127) Feb 20 04:40:11 localhost podman[284635]: 2026-02-20 09:40:11.663148122 +0000 UTC m=+0.164223156 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true) Feb 20 04:40:11 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:40:11 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:40:13 localhost nova_compute[281288]: 2026-02-20 09:40:13.888 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:14 localhost nova_compute[281288]: 2026-02-20 09:40:14.809 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:16 localhost systemd[1]: session-63.scope: Deactivated successfully. Feb 20 04:40:16 localhost systemd[1]: session-63.scope: Consumed 1.343s CPU time. Feb 20 04:40:16 localhost systemd-logind[759]: Session 63 logged out. Waiting for processes to exit. 
Feb 20 04:40:16 localhost systemd-logind[759]: Removed session 63.
Feb 20 04:40:17 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 20 04:40:17 localhost podman[241968]: time="2026-02-20T09:40:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 04:40:17 localhost podman[241968]: @ - - [20/Feb/2026:09:40:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150740 "" "Go-http-client/1.1"
Feb 20 04:40:17 localhost podman[241968]: @ - - [20/Feb/2026:09:40:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17282 "" "Go-http-client/1.1"
Feb 20 04:40:18 localhost nova_compute[281288]: 2026-02-20 09:40:18.891 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:40:19 localhost nova_compute[281288]: 2026-02-20 09:40:19.845 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:40:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:40:22 localhost podman[284677]: 2026-02-20 09:40:22.151501878 +0000 UTC m=+0.090403047 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:40:22 localhost podman[284677]: 2026-02-20 09:40:22.165980819 +0000 UTC m=+0.104882018 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 04:40:22 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 04:40:23 localhost podman[284790]:
Feb 20 04:40:23 localhost podman[284790]: 2026-02-20 09:40:23.546925215 +0000 UTC m=+0.076331469 container create deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=1770267347, ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Feb 20 04:40:23 localhost systemd[1]: Started libpod-conmon-deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082.scope.
Feb 20 04:40:23 localhost systemd[1]: Started libcrun container.
Feb 20 04:40:23 localhost podman[284790]: 2026-02-20 09:40:23.517847959 +0000 UTC m=+0.047254233 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 04:40:23 localhost podman[284790]: 2026-02-20 09:40:23.627859396 +0000 UTC m=+0.157265630 container init deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-type=git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7)
Feb 20 04:40:23 localhost podman[284790]: 2026-02-20 09:40:23.647873589 +0000 UTC m=+0.177279823 container start deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 04:40:23 localhost podman[284790]: 2026-02-20 09:40:23.648630272 +0000 UTC m=+0.178036546 container attach deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_CLEAN=True, RELEASE=main, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, distribution-scope=public)
Feb 20 04:40:23 localhost hungry_kare[284805]: 167 167
Feb 20 04:40:23 localhost systemd[1]: libpod-deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082.scope: Deactivated successfully.
Feb 20 04:40:23 localhost podman[284790]: 2026-02-20 09:40:23.652689048 +0000 UTC m=+0.182095302 container died deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 04:40:23 localhost podman[284810]: 2026-02-20 09:40:23.75644754 +0000 UTC m=+0.089647492 container remove deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, name=rhceph, ceph=True, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux )
Feb 20 04:40:23 localhost systemd[1]: libpod-conmon-deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082.scope: Deactivated successfully.
Feb 20 04:40:23 localhost nova_compute[281288]: 2026-02-20 09:40:23.894 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:40:23 localhost podman[284832]:
Feb 20 04:40:23 localhost podman[284832]: 2026-02-20 09:40:23.98954744 +0000 UTC m=+0.077951829 container create 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vendor=Red Hat, Inc., io.buildah.version=1.42.2, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True)
Feb 20 04:40:24 localhost systemd[1]: Started libpod-conmon-48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32.scope.
Feb 20 04:40:24 localhost systemd[1]: Started libcrun container.
Feb 20 04:40:24 localhost podman[284832]: 2026-02-20 09:40:23.959620607 +0000 UTC m=+0.048025016 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 04:40:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 04:40:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 04:40:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 04:40:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 04:40:24 localhost podman[284832]: 2026-02-20 09:40:24.070411548 +0000 UTC m=+0.158815907 container init 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, RELEASE=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True)
Feb 20 04:40:24 localhost podman[284832]: 2026-02-20 09:40:24.080839432 +0000 UTC m=+0.169243801 container start 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1770267347, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Feb 20 04:40:24 localhost podman[284832]: 2026-02-20 09:40:24.0810928 +0000 UTC m=+0.169497199 container attach 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, ceph=True, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Feb 20 04:40:24 localhost systemd[1]: tmp-crun.6hsQM0.mount: Deactivated successfully.
Feb 20 04:40:24 localhost systemd[1]: var-lib-containers-storage-overlay-6f0f50769b2da77e064c5711c04601da71302746682bc5484215241e6fe690ea-merged.mount: Deactivated successfully.
Feb 20 04:40:24 localhost nova_compute[281288]: 2026-02-20 09:40:24.879 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:40:25 localhost priceless_swirles[284847]: [
Feb 20 04:40:25 localhost priceless_swirles[284847]: {
Feb 20 04:40:25 localhost priceless_swirles[284847]: "available": false,
Feb 20 04:40:25 localhost priceless_swirles[284847]: "ceph_device": false,
Feb 20 04:40:25 localhost priceless_swirles[284847]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "lsm_data": {},
Feb 20 04:40:25 localhost priceless_swirles[284847]: "lvs": [],
Feb 20 04:40:25 localhost priceless_swirles[284847]: "path": "/dev/sr0",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "rejected_reasons": [
Feb 20 04:40:25 localhost priceless_swirles[284847]: "Has a FileSystem",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "Insufficient space (<5GB)"
Feb 20 04:40:25 localhost priceless_swirles[284847]: ],
Feb 20 04:40:25 localhost priceless_swirles[284847]: "sys_api": {
Feb 20 04:40:25 localhost priceless_swirles[284847]: "actuators": null,
Feb 20 04:40:25 localhost priceless_swirles[284847]: "device_nodes": "sr0",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "human_readable_size": "482.00 KB",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "id_bus": "ata",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "model": "QEMU DVD-ROM",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "nr_requests": "2",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "partitions": {},
Feb 20 04:40:25 localhost priceless_swirles[284847]: "path": "/dev/sr0",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "removable": "1",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "rev": "2.5+",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "ro": "0",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "rotational": "1",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "sas_address": "",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "sas_device_handle": "",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "scheduler_mode": "mq-deadline",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "sectors": 0,
Feb 20 04:40:25 localhost priceless_swirles[284847]: "sectorsize": "2048",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "size": 493568.0,
Feb 20 04:40:25 localhost priceless_swirles[284847]: "support_discard": "0",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "type": "disk",
Feb 20 04:40:25 localhost priceless_swirles[284847]: "vendor": "QEMU"
Feb 20 04:40:25 localhost priceless_swirles[284847]: }
Feb 20 04:40:25 localhost priceless_swirles[284847]: }
Feb 20 04:40:25 localhost priceless_swirles[284847]: ]
Feb 20 04:40:25 localhost systemd[1]: libpod-48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32.scope: Deactivated successfully.
Feb 20 04:40:25 localhost podman[286553]: 2026-02-20 09:40:25.121267964 +0000 UTC m=+0.046017744 container died 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.buildah.version=1.42.2, version=7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 04:40:25 localhost systemd[1]: var-lib-containers-storage-overlay-0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232-merged.mount: Deactivated successfully.
Feb 20 04:40:25 localhost podman[286553]: 2026-02-20 09:40:25.169835847 +0000 UTC m=+0.094585577 container remove 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , name=rhceph, io.buildah.version=1.42.2, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z)
Feb 20 04:40:25 localhost systemd[1]: libpod-conmon-48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32.scope: Deactivated successfully.
Feb 20 04:40:26 localhost systemd[1]: Stopping User Manager for UID 1003...
Feb 20 04:40:26 localhost systemd[283296]: Activating special unit Exit the Session...
Feb 20 04:40:26 localhost systemd[283296]: Stopped target Main User Target.
Feb 20 04:40:26 localhost systemd[283296]: Stopped target Basic System.
Feb 20 04:40:26 localhost systemd[283296]: Stopped target Paths.
Feb 20 04:40:26 localhost systemd[283296]: Stopped target Sockets.
Feb 20 04:40:26 localhost systemd[283296]: Stopped target Timers.
Feb 20 04:40:26 localhost systemd[283296]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 20 04:40:26 localhost systemd[283296]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 04:40:26 localhost systemd[283296]: Closed D-Bus User Message Bus Socket.
Feb 20 04:40:26 localhost systemd[283296]: Stopped Create User's Volatile Files and Directories.
Feb 20 04:40:26 localhost systemd[283296]: Removed slice User Application Slice.
Feb 20 04:40:26 localhost systemd[283296]: Reached target Shutdown.
Feb 20 04:40:26 localhost systemd[283296]: Finished Exit the Session.
Feb 20 04:40:26 localhost systemd[283296]: Reached target Exit the Session.
Feb 20 04:40:26 localhost systemd[1]: user@1003.service: Deactivated successfully.
Feb 20 04:40:26 localhost systemd[1]: Stopped User Manager for UID 1003.
Feb 20 04:40:26 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 20 04:40:26 localhost openstack_network_exporter[244414]: ERROR 09:40:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 04:40:26 localhost openstack_network_exporter[244414]:
Feb 20 04:40:26 localhost openstack_network_exporter[244414]: ERROR 09:40:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 04:40:26 localhost openstack_network_exporter[244414]:
Feb 20 04:40:26 localhost systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 20 04:40:26 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 20 04:40:26 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 20 04:40:26 localhost systemd[1]: Removed slice User Slice of UID 1003.
Feb 20 04:40:26 localhost systemd[1]: user-1003.slice: Consumed 1.778s CPU time.
Feb 20 04:40:28 localhost nova_compute[281288]: 2026-02-20 09:40:28.897 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:40:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 04:40:29 localhost podman[286605]: 2026-02-20 09:40:29.152350324 +0000 UTC m=+0.087652552 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 04:40:29 localhost podman[286605]: 2026-02-20 09:40:29.163103999 +0000 UTC m=+0.098406277 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 20 04:40:29 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 04:40:29 localhost nova_compute[281288]: 2026-02-20 09:40:29.920 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:40:33 localhost nova_compute[281288]: 2026-02-20 09:40:33.900 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:40:34 localhost nova_compute[281288]: 2026-02-20 09:40:34.969 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:40:36 localhost podman[286629]: 2026-02-20 09:40:36.162189201 +0000 UTC m=+0.100052477 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 20 04:40:36 localhost podman[286629]: 2026-02-20 09:40:36.172998147 +0000 UTC m=+0.110861413 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command':
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:40:36 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:40:38 localhost nova_compute[281288]: 2026-02-20 09:40:38.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:40:39 localhost podman[286652]: 2026-02-20 09:40:39.149134413 +0000 UTC m=+0.085934588 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter) Feb 20 04:40:39 localhost podman[286652]: 2026-02-20 09:40:39.191965537 +0000 UTC m=+0.128765692 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., vcs-type=git, release=1770267347, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 20 04:40:39 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:40:40 localhost nova_compute[281288]: 2026-02-20 09:40:40.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:40 localhost sshd[286672]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:40:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:40:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:40:42 localhost podman[286674]: 2026-02-20 09:40:42.153821067 +0000 UTC m=+0.087272639 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:40:42 localhost podman[286675]: 2026-02-20 09:40:42.219203143 +0000 UTC m=+0.148833126 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 20 04:40:42 localhost podman[286675]: 2026-02-20 09:40:42.230071371 +0000 UTC m=+0.159701394 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true) Feb 20 04:40:42 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:40:42 localhost podman[286674]: 2026-02-20 09:40:42.286321333 +0000 UTC m=+0.219772915 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller) Feb 20 04:40:42 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:40:42 localhost nova_compute[281288]: 2026-02-20 09:40:42.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:42 localhost nova_compute[281288]: 2026-02-20 09:40:42.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 20 04:40:42 localhost nova_compute[281288]: 2026-02-20 09:40:42.738 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 20 04:40:42 localhost nova_compute[281288]: 2026-02-20 09:40:42.738 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:42 localhost nova_compute[281288]: 2026-02-20 09:40:42.738 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 20 04:40:42 localhost nova_compute[281288]: 2026-02-20 09:40:42.753 281292 DEBUG oslo_service.periodic_task [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:43 localhost nova_compute[281288]: 2026-02-20 09:40:43.766 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:44 localhost nova_compute[281288]: 2026-02-20 09:40:44.641 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:44 localhost nova_compute[281288]: 2026-02-20 09:40:44.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:44 localhost nova_compute[281288]: 2026-02-20 09:40:44.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:45 localhost nova_compute[281288]: 2026-02-20 09:40:45.017 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:45 localhost nova_compute[281288]: 2026-02-20 09:40:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:45 localhost 
nova_compute[281288]: 2026-02-20 09:40:45.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:45 localhost nova_compute[281288]: 2026-02-20 09:40:45.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:45 localhost nova_compute[281288]: 2026-02-20 09:40:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:40:45 localhost nova_compute[281288]: 2026-02-20 09:40:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:40:45 localhost nova_compute[281288]: 2026-02-20 09:40:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:40:45 localhost nova_compute[281288]: 2026-02-20 09:40:45.742 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain 
(node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:40:45 localhost nova_compute[281288]: 2026-02-20 09:40:45.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.234 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.311 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.312 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.499 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.500 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12312MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.500 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.501 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.599 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.600 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.600 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.639 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.712 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 
04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.713 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.727 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.750 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: 
HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 04:40:46 localhost nova_compute[281288]: 2026-02-20 09:40:46.794 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:40:47 localhost nova_compute[281288]: 2026-02-20 09:40:47.281 281292 DEBUG oslo_concurrency.processutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:40:47 localhost nova_compute[281288]: 2026-02-20 09:40:47.288 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:40:47 localhost nova_compute[281288]: 2026-02-20 09:40:47.306 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:40:47 localhost nova_compute[281288]: 2026-02-20 09:40:47.309 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:40:47 localhost nova_compute[281288]: 2026-02-20 09:40:47.309 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:40:47 localhost podman[241968]: time="2026-02-20T09:40:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:40:47 localhost podman[241968]: @ - - [20/Feb/2026:09:40:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150740 "" "Go-http-client/1.1" Feb 20 04:40:47 localhost podman[241968]: @ - - [20/Feb/2026:09:40:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17293 "" "Go-http-client/1.1" Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.307 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.308 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.308 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.309 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.645 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.924 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.925 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.925 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:40:49 localhost nova_compute[281288]: 2026-02-20 09:40:49.925 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:40:50 localhost nova_compute[281288]: 2026-02-20 09:40:50.019 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:50 localhost nova_compute[281288]: 2026-02-20 09:40:50.303 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": 
"fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:40:50 localhost nova_compute[281288]: 2026-02-20 09:40:50.319 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:40:50 localhost nova_compute[281288]: 2026-02-20 09:40:50.319 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:40:50 localhost nova_compute[281288]: 2026-02-20 09:40:50.320 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running 
periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:40:50 localhost nova_compute[281288]: 2026-02-20 09:40:50.320 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:40:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:40:52 localhost podman[286780]: 2026-02-20 09:40:52.646874979 +0000 UTC m=+0.086100132 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:40:52 localhost podman[286780]: 2026-02-20 09:40:52.657910173 +0000 UTC m=+0.097135346 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:40:52 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:40:54 localhost nova_compute[281288]: 2026-02-20 09:40:54.681 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:55 localhost nova_compute[281288]: 2026-02-20 09:40:55.021 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:40:56 localhost openstack_network_exporter[244414]: ERROR 09:40:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:40:56 localhost openstack_network_exporter[244414]: Feb 20 04:40:56 localhost openstack_network_exporter[244414]: ERROR 09:40:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:40:56 localhost openstack_network_exporter[244414]: Feb 20 04:40:59 localhost nova_compute[281288]: 2026-02-20 09:40:59.719 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:00 localhost nova_compute[281288]: 2026-02-20 09:41:00.023 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:41:00 localhost systemd[1]: tmp-crun.GP6eGb.mount: Deactivated successfully. Feb 20 04:41:00 localhost podman[286902]: 2026-02-20 09:41:00.153391965 +0000 UTC m=+0.096038802 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:41:00 localhost podman[286902]: 2026-02-20 09:41:00.163177959 +0000 UTC m=+0.105824876 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': 
'/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:41:00 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:41:01 localhost podman[287002]: Feb 20 04:41:01 localhost podman[287002]: 2026-02-20 09:41:01.636926747 +0000 UTC m=+0.081160568 container create 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, CEPH_POINT_RELEASE=, 
GIT_BRANCH=main, RELEASE=main) Feb 20 04:41:01 localhost systemd[1]: Started libpod-conmon-51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f.scope. Feb 20 04:41:01 localhost systemd[1]: Started libcrun container. Feb 20 04:41:01 localhost podman[287002]: 2026-02-20 09:41:01.602067131 +0000 UTC m=+0.046301022 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:41:01 localhost podman[287002]: 2026-02-20 09:41:01.715919557 +0000 UTC m=+0.160153368 container init 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:41:01 localhost podman[287002]: 2026-02-20 09:41:01.727760966 +0000 UTC m=+0.171994777 container start 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7) Feb 20 04:41:01 localhost podman[287002]: 2026-02-20 09:41:01.728126107 +0000 UTC m=+0.172359968 container attach 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, name=rhceph, build-date=2026-02-09T10:25:24Z, release=1770267347, vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.42.2) Feb 20 04:41:01 localhost systemd[1]: libpod-51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f.scope: Deactivated successfully. Feb 20 04:41:01 localhost goofy_mendeleev[287017]: 167 167 Feb 20 04:41:01 localhost podman[287002]: 2026-02-20 09:41:01.735699893 +0000 UTC m=+0.179933734 container died 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph 
Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7) Feb 20 04:41:01 localhost podman[287022]: 2026-02-20 09:41:01.843033375 +0000 UTC m=+0.093517003 container remove 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7) Feb 20 04:41:01 localhost systemd[1]: libpod-conmon-51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f.scope: Deactivated successfully. Feb 20 04:41:01 localhost systemd[1]: Reloading. Feb 20 04:41:02 localhost systemd-rc-local-generator[287058]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:41:02 localhost systemd-sysv-generator[287062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: var-lib-containers-storage-overlay-173e2ff66011b02bc37c533e2757c4ccf663dd3e2e6bdce4052d9039c460a794-merged.mount: Deactivated successfully. Feb 20 04:41:02 localhost systemd[1]: Reloading. Feb 20 04:41:02 localhost systemd-rc-local-generator[287104]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:41:02 localhost systemd-sysv-generator[287108]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:02 localhost systemd[1]: Starting Ceph mgr.np0005625204.exgrzx for a8557ee9-b55d-5519-942c-cf8f6172f1d8... 
Feb 20 04:41:02 localhost podman[287168]: Feb 20 04:41:03 localhost podman[287168]: 2026-02-20 09:41:03.010334069 +0000 UTC m=+0.084206484 container create 04a522f5043e1f0321542737f02274209f1473a7e6f57561c1fa327510156513 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1770267347, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_BRANCH=main) Feb 20 04:41:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9e28716f67341559a2c7754a3d43e5be9a65141ccec5caabb6d3668d2ba6c5a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9e28716f67341559a2c7754a3d43e5be9a65141ccec5caabb6d3668d2ba6c5a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:03 localhost kernel: xfs 
filesystem being remounted at /var/lib/containers/storage/overlay/b9e28716f67341559a2c7754a3d43e5be9a65141ccec5caabb6d3668d2ba6c5a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9e28716f67341559a2c7754a3d43e5be9a65141ccec5caabb6d3668d2ba6c5a/merged/var/lib/ceph/mgr/ceph-np0005625204.exgrzx supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:03 localhost podman[287168]: 2026-02-20 09:41:02.975180233 +0000 UTC m=+0.049052688 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:41:03 localhost podman[287168]: 2026-02-20 09:41:03.081932738 +0000 UTC m=+0.155805153 container init 04a522f5043e1f0321542737f02274209f1473a7e6f57561c1fa327510156513 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx, GIT_BRANCH=main, ceph=True, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 20 04:41:03 localhost podman[287168]: 2026-02-20 09:41:03.089502724 +0000 UTC m=+0.163375139 container start 04a522f5043e1f0321542737f02274209f1473a7e6f57561c1fa327510156513 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, RELEASE=main, architecture=x86_64) Feb 20 04:41:03 localhost bash[287168]: 04a522f5043e1f0321542737f02274209f1473a7e6f57561c1fa327510156513 Feb 20 04:41:03 localhost systemd[1]: Started Ceph mgr.np0005625204.exgrzx for a8557ee9-b55d-5519-942c-cf8f6172f1d8. 
Feb 20 04:41:03 localhost ceph-mgr[287186]: set uid:gid to 167:167 (ceph:ceph) Feb 20 04:41:03 localhost ceph-mgr[287186]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mgr, pid 2 Feb 20 04:41:03 localhost ceph-mgr[287186]: pidfile_write: ignore empty --pid-file Feb 20 04:41:03 localhost ceph-mgr[287186]: mgr[py] Loading python module 'alerts' Feb 20 04:41:03 localhost ceph-mgr[287186]: mgr[py] Module alerts has missing NOTIFY_TYPES member Feb 20 04:41:03 localhost ceph-mgr[287186]: mgr[py] Loading python module 'balancer' Feb 20 04:41:03 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:03.266+0000 7f0946085140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Feb 20 04:41:03 localhost ceph-mgr[287186]: mgr[py] Module balancer has missing NOTIFY_TYPES member Feb 20 04:41:03 localhost ceph-mgr[287186]: mgr[py] Loading python module 'cephadm' Feb 20 04:41:03 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:03.337+0000 7f0946085140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Feb 20 04:41:03 localhost ceph-mgr[287186]: mgr[py] Loading python module 'crash' Feb 20 04:41:03 localhost ceph-mgr[287186]: mgr[py] Module crash has missing NOTIFY_TYPES member Feb 20 04:41:03 localhost ceph-mgr[287186]: mgr[py] Loading python module 'dashboard' Feb 20 04:41:03 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:03.980+0000 7f0946085140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Loading python module 'devicehealth' Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Loading python module 'diskprediction_local' Feb 20 04:41:04 localhost 
ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:04.545+0000 7f0946085140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Feb 20 04:41:04 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. Feb 20 04:41:04 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: from numpy import show_config as show_numpy_config Feb 20 04:41:04 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:04.690+0000 7f0946085140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Loading python module 'influx' Feb 20 04:41:04 localhost nova_compute[281288]: 2026-02-20 09:41:04.770 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Module influx has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Loading python module 'insights' Feb 20 04:41:04 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:04.797+0000 7f0946085140 -1 mgr[py] 
Module influx has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Loading python module 'iostat' Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Module iostat has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost ceph-mgr[287186]: mgr[py] Loading python module 'k8sevents' Feb 20 04:41:04 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:04.915+0000 7f0946085140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Feb 20 04:41:04 localhost systemd[1]: tmp-crun.3i0FaH.mount: Deactivated successfully. Feb 20 04:41:04 localhost podman[287343]: 2026-02-20 09:41:04.967949314 +0000 UTC m=+0.108996145 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, ceph=True, 
vendor=Red Hat, Inc., CEPH_POINT_RELEASE=) Feb 20 04:41:05 localhost nova_compute[281288]: 2026-02-20 09:41:05.025 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:05 localhost podman[287343]: 2026-02-20 09:41:05.115176819 +0000 UTC m=+0.256223680 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, architecture=x86_64, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public) Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Loading python module 'localpool' Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Loading python module 'mds_autoscaler' Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Loading python module 'mirroring' Feb 20 04:41:05 localhost 
ceph-mgr[287186]: mgr[py] Loading python module 'nfs' Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Module nfs has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Loading python module 'orchestrator' Feb 20 04:41:05 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.634+0000 7f0946085140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Loading python module 'osd_perf_query' Feb 20 04:41:05 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.774+0000 7f0946085140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Loading python module 'osd_support' Feb 20 04:41:05 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.837+0000 7f0946085140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.890+0000 7f0946085140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Loading python module 'pg_autoscaler' Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Feb 20 04:41:05 localhost ceph-mgr[287186]: mgr[py] Loading python module 'progress' Feb 20 04:41:05 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.956+0000 7f0946085140 -1 mgr[py] Module 
pg_autoscaler has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:41:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:41:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:41:06.004 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:41:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:41:06.005 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Module progress has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Loading python module 'prometheus' Feb 20 04:41:06 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:06.013+0000 7f0946085140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:06.303+0000 7f0946085140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Loading python module 'rbd_support' Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Loading python 
module 'restful' Feb 20 04:41:06 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:06.381+0000 7f0946085140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Loading python module 'rgw' Feb 20 04:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Module rgw has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost ceph-mgr[287186]: mgr[py] Loading python module 'rook' Feb 20 04:41:06 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:06.698+0000 7f0946085140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Feb 20 04:41:06 localhost podman[287480]: 2026-02-20 09:41:06.722219277 +0000 UTC m=+0.094720521 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:41:06 localhost podman[287480]: 2026-02-20 09:41:06.738140092 +0000 UTC m=+0.110641316 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:41:06 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Module rook has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'selftest' Feb 20 04:41:07 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.095+0000 7f0946085140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'snap_schedule' Feb 20 04:41:07 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.155+0000 7f0946085140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'stats' Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'status' Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Module status has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'telegraf' Feb 20 04:41:07 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.340+0000 7f0946085140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'telemetry' Feb 20 04:41:07 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.397+0000 7f0946085140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Module 
telemetry has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'test_orchestrator' Feb 20 04:41:07 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.527+0000 7f0946085140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'volumes' Feb 20 04:41:07 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.668+0000 7f0946085140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost sshd[287521]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:41:07 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.850+0000 7f0946085140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Loading python module 'zabbix' Feb 20 04:41:07 localhost ceph-mgr[287186]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.908+0000 7f0946085140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 20 04:41:07 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd1e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 20 04:41:07 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.103:6800/1308191220 Feb 20 04:41:09 localhost nova_compute[281288]: 2026-02-20 09:41:09.814 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:10 localhost 
nova_compute[281288]: 2026-02-20 09:41:10.026 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:41:10 localhost podman[287523]: 2026-02-20 09:41:10.136440005 +0000 UTC m=+0.074867482 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, 
url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Feb 20 04:41:10 localhost podman[287523]: 2026-02-20 09:41:10.153085083 +0000 UTC m=+0.091512570 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 20 04:41:10 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:41:10 localhost sshd[287544]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:41:10 localhost sshd[287546]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:41:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:41:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:41:13 localhost systemd[1]: tmp-crun.8vYZug.mount: Deactivated successfully. 
Feb 20 04:41:13 localhost podman[287566]: 2026-02-20 09:41:13.140118959 +0000 UTC m=+0.079970512 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:41:13 localhost podman[287566]: 2026-02-20 09:41:13.180091304 +0000 UTC m=+0.119942887 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 04:41:13 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:41:13 localhost podman[287567]: 2026-02-20 09:41:13.185494902 +0000 UTC m=+0.123175637 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:41:13 localhost 
podman[287567]: 2026-02-20 09:41:13.268031992 +0000 UTC m=+0.205712757 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Feb 20 04:41:13 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:41:14 localhost nova_compute[281288]: 2026-02-20 09:41:14.856 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:15 localhost nova_compute[281288]: 2026-02-20 09:41:15.028 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:17 localhost podman[241968]: time="2026-02-20T09:41:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:41:17 localhost podman[241968]: @ - - [20/Feb/2026:09:41:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152934 "" "Go-http-client/1.1" Feb 20 04:41:17 localhost podman[241968]: @ - - [20/Feb/2026:09:41:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17776 "" "Go-http-client/1.1" Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.204 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.209 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0ca0f87-9326-4bf6-971c-c2d8292d7d47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.205498', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e73c3fc-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'e51641bcb468ffb933d16eb86f1f024166de44bf7e4fb7ab83100b10af79912a'}]}, 'timestamp': '2026-02-20 09:41:18.210353', '_unique_id': '9d4f5bba6724492789c735c4e029a740'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 
04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.240 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.241 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '25c9448b-18ff-4876-85bb-7179ba1e7868', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.212322', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e788892-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '1bf19394436b84ba7577e6130026489f516f899e653a32c5a7d3e4866a2ad725'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.212322', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7896a2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '176cadc1cafd4615f50c9531ed17736e0f78fa2971f1c132595e7dd763347538'}]}, 'timestamp': '2026-02-20 09:41:18.241911', '_unique_id': 'f22d99f77a69441b8deded7550b5ddfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '929d4bb7-9a18-4417-94d3-81619aa42dce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.243958', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e78f296-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'd13b47b024b7e5639fe686247d8a59f053c1699dd8382a7416e9b7ea02cc8b00'}]}, 'timestamp': '2026-02-20 09:41:18.244293', '_unique_id': '3fc304af228c476395ebe3c8b486a32e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.245 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.245 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66881217-4ea4-49d2-8314-6ce4dc24e7fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.245626', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e793422-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '44299dd4e7107be0e9e77dbac4bcd1eafe7a4d7383daa029d58bb21e4b81834d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.245626', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e793e18-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'f6d13830dd37cd926140047736578a176eb8d8cd5948ffdbd84832868cb0ad1c'}]}, 'timestamp': '2026-02-20 09:41:18.246176', '_unique_id': 'd0fe15a07bba4384bfbffe31c4c67e39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.257 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.258 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e674b1f-1f6a-42c8-999f-4f85d0551764', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.247506', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e7b18d2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': 'ac4ebb4bf1667f2151e86d3a6f350ca45a49d8c3781b0464f6d678dd8486be6d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.247506', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7b249e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': 'df0b99bec8216efd096691d19a77751a20d642ad8be4d63e154eb5a5c6285b54'}]}, 'timestamp': '2026-02-20 09:41:18.258663', '_unique_id': '50fc6cf364d042d991999e4af2a4109c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20
09:41:18.259 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:41:18.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.260 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '74680d83-7c93-44f6-9cce-0d89e528f807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.260425', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e7b75e8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': '4bc7e1e7acddbd8752ae5c159b3890b0464db04da385d902ea9372a841b5a11f'}]}, 'timestamp': '2026-02-20 09:41:18.260763', '_unique_id': 'bdb851dd57b74f82b734f23ed063582a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f41f0410-3fee-447e-99f9-9487fc00617f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.262115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e7bb756-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': '626b2affd87870526ed91f86a1fda02f80079de80b81d812586624fc8d7ff5a5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.262115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7bc17e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': '0cf99cafa2d6304250d9eea04d9c8f1a7388db28c5a0680aec00b573b5409512'}]}, 'timestamp': '2026-02-20 09:41:18.262660', '_unique_id': 'ae578f14de1447fd8eba820188f15dd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f664eeb-02ea-4cbf-a06c-e6234bf9b0fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.263985', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e7c0080-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'b69b978cd8c61ca68c8b5c88c211756b6136c9099a710ca0ed541ec421487811'}]}, 'timestamp': '2026-02-20 09:41:18.264297', '_unique_id': '7475cbb0d20c47a28a8dddae99c776c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.265 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.265 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da48e571-2403-4652-94cf-e8dd90e4c4e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.265781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e7c468a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '6dcb1cedea515aac49947d7e27d5cd6877e01d7f306d4ed37e3b52bfd13ed15f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.265781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7c50b2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '4f67a15de5ac3568e20a8116b748ed852b14ea77f2fbcee29d53150b0a53b675'}]}, 'timestamp': '2026-02-20 09:41:18.266314', '_unique_id': '5490b2aeb11942dfb5ebedb49e84ea61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.267 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c2de212-c6ce-4b60-8a1b-69d24ba5acf3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.267612', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e7c8ece-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'bd5759504a0607975fb244bb317e884cfe5a3369fedb26ca4448d3232d41d018'}]}, 'timestamp': '2026-02-20 09:41:18.267920', '_unique_id': 'be5ec14ca0d14c7f919013e0d92d8279'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.269 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.285 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '783080da-111d-43a8-80bb-2a6303b3fe55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:41:18.269197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4e7f6090-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.52498042, 
'message_signature': 'c2eddadf6e12bbcfca79496484eda8ded3a333271fd18282b4a57cb2d94bd152'}]}, 'timestamp': '2026-02-20 09:41:18.286450', '_unique_id': '11b4b4ada7344d91a3efa9b4e446a7cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.288 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.288 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0dd6fd65-4fd8-4c15-a67f-d7b91c9f40bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.288251', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e7fb4aa-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': '9edbb0387b876dde9cd4a89792f19c1474abe29decfff6c18b0dda2383039128'}]}, 'timestamp': '2026-02-20 09:41:18.288559', '_unique_id': 'c424baaba3c8479f9da27e762cce030b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b84c887-0d44-47c9-abc0-68b783df62a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.289904', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e7ff4ba-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'ce727c06d3fd82a81ff932965a6cac3834122da2d5953a1a7133c533dec05462'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.289904', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7ffee2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'b0cb146a8f4e9a9961048ab80b3f2364f5ce52eb35fdead2bc39036736454f45'}]}, 'timestamp': '2026-02-20 09:41:18.290437', '_unique_id': '67cb37d2f71545e994b6323ce14b1b05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 13400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8f5fadb-8bb8-415c-8aaf-b62d11d25e22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13400000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:41:18.291773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4e803dee-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.52498042, 'message_signature': '7fe951bdca0fab0ab91435da8f83d72319d1a33652e55cd56cff46065c20cd05'}]}, 'timestamp': '2026-02-20 09:41:18.292058', '_unique_id': '45080121cc8c4a54a32355f25bb549ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72bf3aca-6cbc-49ea-a91a-ecb5f980cfa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.293360', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e807be2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': '24995bcbef417870006e5a70f9f8ebe61a60c44887303e21a353b106e4c052d9'}]}, 'timestamp': '2026-02-20 09:41:18.293669', '_unique_id': '88a3c0bdf6b94cbfaffbf217e2328175'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.295 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fc51fd9-40ea-4d6a-b518-571b42a6b8a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.294945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e80b99a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '37a6c85e9636b72f2add9d0dd792879dd8974959b432186e1e90dace43d1e80c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.294945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e80c368-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'faebb5f3ca4fa834ab27da47bbb6073f9600537d30ea535f2049167a8d6fb919'}]}, 'timestamp': '2026-02-20 09:41:18.295467', '_unique_id': 'c8608659408c4945bb93c993deb2587a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:41:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2eb4d616-f2b3-4def-ac76-4d6546fe9c91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.296895', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e81059e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': '5af7f7d962e28109ea9bf220919fe03a2af5a6ad944f63b2637b59b5d19b82db'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.296895', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e810fb2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': 'e766ace757b2c49e776c1efc857cf8a9c2ff293229c3aa70df5d4deb30541f4b'}]}, 'timestamp': '2026-02-20 09:41:18.297418', '_unique_id': '191ab5a2994a43ebbeeacae20a26b637'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:41:18.297 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.298 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.298 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0883aa5e-dac5-4805-aae2-95c3ff7913b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.298821', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e815116-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'b183a7cb436eecdb85b8bcfa91dbeed4bb05b2630e17b8ebc7721c8efc93dbe6'}]}, 'timestamp': '2026-02-20 09:41:18.299109', '_unique_id': '933ba66d50f24b07a3f06d36f1204cbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.300 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.300 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.300 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e62c15e5-d029-45a8-ada2-0df81b44b8aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.300408', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e818f0a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'f45dc91944a8c438f128e27dfea5853726e68cfae97fcd380f2f0babb2ad51fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.300408', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e8199b4-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '300c944a4778e461e0402225cb3e7f1023a0ea140c56db331a1edbc238e3df24'}]}, 'timestamp': '2026-02-20 09:41:18.300953', '_unique_id': 'ee0f3875cc994c8396708b575c3d33a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.302 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0d31d06-781b-413b-8cd3-833e88eb1f50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.302319', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e81d9a6-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'd82c754db8e56c98048393f8071ed2fc4c15dd0f27535ff708b01755eb930a22'}]}, 'timestamp': '2026-02-20 09:41:18.302603', '_unique_id': 'd4867395cb304a5d808714a9c7215d91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 
04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '83f7e59e-c374-4009-8091-2877410b276d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.303962', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e8219a2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': '25accc19377fb033ee8ce98debeaf265b3e3924c5d7cfcd35d3573611f5eedf8'}]}, 'timestamp': '2026-02-20 09:41:18.304260', '_unique_id': 'efda48d1d86e4425b235c1baa8679a23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:41:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:41:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging Feb 20 04:41:18 localhost podman[288347]: 
Feb 20 04:41:18 localhost podman[288347]: 2026-02-20 09:41:18.365774169 +0000 UTC m=+0.076551324 container create 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, io.buildah.version=1.42.2, release=1770267347, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:41:18 localhost systemd[1]: Started libpod-conmon-3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c.scope. Feb 20 04:41:18 localhost systemd[1]: Started libcrun container. 
Feb 20 04:41:18 localhost podman[288347]: 2026-02-20 09:41:18.433480788 +0000 UTC m=+0.144257943 container init 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.42.2, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 20 04:41:18 localhost podman[288347]: 2026-02-20 09:41:18.335512107 +0000 UTC m=+0.046289282 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:41:18 localhost podman[288347]: 2026-02-20 09:41:18.441926591 +0000 UTC m=+0.152703736 container start 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:41:18 localhost podman[288347]: 2026-02-20 09:41:18.442359234 +0000 UTC m=+0.153136429 container attach 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, vcs-type=git, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_CLEAN=True, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:41:18 localhost magical_blackburn[288363]: 167 167 Feb 20 04:41:18 localhost systemd[1]: libpod-3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c.scope: Deactivated successfully. 
Feb 20 04:41:18 localhost podman[288347]: 2026-02-20 09:41:18.445200093 +0000 UTC m=+0.155977278 container died 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux , architecture=x86_64, RELEASE=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public) Feb 20 04:41:18 localhost podman[288368]: 2026-02-20 09:41:18.529610622 +0000 UTC m=+0.074435199 container remove 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1770267347, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_CLEAN=True, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, version=7) Feb 20 04:41:18 localhost systemd[1]: libpod-conmon-3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c.scope: Deactivated successfully. Feb 20 04:41:18 localhost podman[288386]: Feb 20 04:41:18 localhost podman[288386]: 2026-02-20 09:41:18.617799698 +0000 UTC m=+0.056044026 container create edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, architecture=x86_64, 
ceph=True, GIT_CLEAN=True, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:41:18 localhost systemd[1]: Started libpod-conmon-edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc.scope. Feb 20 04:41:18 localhost systemd[1]: Started libcrun container. Feb 20 04:41:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a11e613b96ca656ef7659440bbb51a36ed51c004440dce3d18d42b3069f8e4/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a11e613b96ca656ef7659440bbb51a36ed51c004440dce3d18d42b3069f8e4/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a11e613b96ca656ef7659440bbb51a36ed51c004440dce3d18d42b3069f8e4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a11e613b96ca656ef7659440bbb51a36ed51c004440dce3d18d42b3069f8e4/merged/var/lib/ceph/mon/ceph-np0005625204 supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:18 localhost podman[288386]: 2026-02-20 09:41:18.674559836 +0000 UTC m=+0.112804204 container init edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, ceph=True, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., 
url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, release=1770267347, maintainer=Guillaume Abrioux ) Feb 20 04:41:18 localhost podman[288386]: 2026-02-20 09:41:18.682358189 +0000 UTC m=+0.120602537 container start edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, architecture=x86_64, ceph=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, name=rhceph, 
io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:41:18 localhost podman[288386]: 2026-02-20 09:41:18.682595376 +0000 UTC m=+0.120839754 container attach edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, vcs-type=git, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.42.2, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph) Feb 20 04:41:18 localhost podman[288386]: 2026-02-20 09:41:18.590087555 +0000 UTC m=+0.028331953 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:41:18 localhost systemd[1]: 
libpod-edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc.scope: Deactivated successfully. Feb 20 04:41:18 localhost podman[288386]: 2026-02-20 09:41:18.774098466 +0000 UTC m=+0.212342854 container died edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, version=7, RELEASE=main, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.42.2, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:41:18 localhost podman[288427]: 2026-02-20 09:41:18.846675616 +0000 UTC m=+0.062309761 container remove edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_BRANCH=main, 
com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc.) Feb 20 04:41:18 localhost systemd[1]: libpod-conmon-edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc.scope: Deactivated successfully. Feb 20 04:41:18 localhost systemd[1]: Reloading. Feb 20 04:41:18 localhost systemd-rc-local-generator[288465]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:41:18 localhost systemd-sysv-generator[288470]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: var-lib-containers-storage-overlay-f5184decee9029c694779f63b9e03a50095e53d0d479941ae3f2c6f3e74347d7-merged.mount: Deactivated successfully. Feb 20 04:41:19 localhost systemd[1]: Reloading. Feb 20 04:41:19 localhost systemd-sysv-generator[288508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:41:19 localhost systemd-rc-local-generator[288504]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:41:19 localhost systemd[1]: Starting Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8... 
Feb 20 04:41:19 localhost nova_compute[281288]: 2026-02-20 09:41:19.859 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:19 localhost podman[288568]: Feb 20 04:41:20 localhost podman[288568]: 2026-02-20 09:41:20.001439419 +0000 UTC m=+0.073899292 container create 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Feb 20 04:41:20 localhost nova_compute[281288]: 2026-02-20 09:41:20.029 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:20 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d/merged/var/lib/ceph/mon/ceph-np0005625204 supports timestamps until 2038 (0x7fffffff) Feb 20 04:41:20 localhost podman[288568]: 2026-02-20 09:41:20.055493932 +0000 UTC m=+0.127953805 container init 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347) Feb 20 04:41:20 localhost podman[288568]: 2026-02-20 09:41:20.065734491 +0000 UTC m=+0.138194364 container start 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1770267347, GIT_BRANCH=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, vcs-type=git, version=7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=) Feb 20 04:41:20 localhost bash[288568]: 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 Feb 20 04:41:20 localhost podman[288568]: 2026-02-20 09:41:19.971218017 +0000 UTC m=+0.043677940 image pull 
registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:41:20 localhost systemd[1]: Started Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8. Feb 20 04:41:20 localhost ceph-mon[288586]: set uid:gid to 167:167 (ceph:ceph) Feb 20 04:41:20 localhost ceph-mon[288586]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mon, pid 2 Feb 20 04:41:20 localhost ceph-mon[288586]: pidfile_write: ignore empty --pid-file Feb 20 04:41:20 localhost ceph-mon[288586]: load: jerasure load: lrc Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: RocksDB version: 7.9.2 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Git sha 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Compile date 2026-02-06 00:00:00 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: DB SUMMARY Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: DB Session ID: RDMWWACFW9Z8Q9K53AN8 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: CURRENT file: CURRENT Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: IDENTITY file: IDENTITY Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005625204/store.db dir, Total Num: 0, files: Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005625204/store.db: 000004.log size: 761 ; Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.error_if_exists: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.create_if_missing: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.paranoid_checks: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 20 
04:41:20 localhost ceph-mon[288586]: rocksdb: Options.env: 0x562d04acfa20 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.fs: PosixFileSystem Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.info_log: 0x562d057fcd20 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_file_opening_threads: 16 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.statistics: (nil) Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.use_fsync: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_log_file_size: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.log_file_time_to_roll: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.keep_log_file_num: 1000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.recycle_log_file_num: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.allow_fallocate: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.allow_mmap_reads: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.allow_mmap_writes: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.use_direct_reads: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.create_missing_column_families: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.db_log_dir: Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.wal_dir: Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.table_cache_numshardbits: 6 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: 
Options.manifest_preallocation_size: 4194304 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.advise_random_on_open: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.db_write_buffer_size: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.write_buffer_manager: 0x562d0580d540 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.use_adaptive_mutex: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.rate_limiter: (nil) Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.wal_recovery_mode: 2 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.enable_thread_tracking: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.enable_pipelined_write: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.unordered_write: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.row_cache: None Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.wal_filter: None Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.allow_ingest_behind: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.two_write_queues: 0 Feb 20 
04:41:20 localhost ceph-mon[288586]: rocksdb: Options.manual_wal_flush: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.wal_compression: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.atomic_flush: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.persist_stats_to_disk: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.log_readahead_size: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.best_efforts_recovery: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.allow_data_in_errors: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.db_host_id: __hostname__ Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.enforce_single_del_contracts: true Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_background_jobs: 2 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_background_compactions: -1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_subcompactions: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.delayed_write_rate : 16777216 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_total_wal_size: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 20 04:41:20 localhost 
ceph-mon[288586]: rocksdb: Options.stats_dump_period_sec: 600 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.stats_persist_period_sec: 600 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_open_files: -1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bytes_per_sync: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_readahead_size: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_background_flushes: -1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Compression algorithms supported: Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: #011kZSTD supported: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: #011kXpressCompression supported: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: #011kBZip2Compression supported: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: #011kLZ4Compression supported: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: #011kZlibCompression supported: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: #011kSnappyCompression supported: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: DMutex implementation: pthread_mutex_t Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005625204/store.db/MANIFEST-000005 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/column_family.cc:630] --------------- Options 
for column family [default]: Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.merge_operator: Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_filter: None Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_filter_factory: None Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.sst_partitioner_factory: None Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562d057fc980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x562d057f9350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: 
Options.write_buffer_size: 33554432 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_write_buffer_number: 2 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression: NoCompression Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression: Disabled Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.prefix_extractor: nullptr Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.num_levels: 7 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: 
Options.compression_opts.level: 32767 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression_opts.enabled: false Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_base: 268435456 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.arena_block_size: 1048576 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 04:41:20 
localhost ceph-mon[288586]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.table_properties_collectors: Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.inplace_update_support: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.bloom_locality: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.max_successive_merges: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.force_consistency_checks: 1 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.ttl: 2592000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.enable_blob_files: false Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.min_blob_size: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.blob_file_size: 268435456 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 04:41:20 localhost 
ceph-mon[288586]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005625204/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ff5418ad-30e3-42a0-9ea4-01185f113ffa Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580480127693, "job": 1, "event": "recovery_started", "wal_files": [4]} Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580480130710, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, 
"num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580480130904, "job": 1, "event": "recovery_finished"} Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562d05820e00 Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: DB pointer 0x562d05916000 Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204 does not exist in monmap, will attempt to join an existing cluster Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:41:20 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 
0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.84 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Sum 1/0 1.84 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative 
compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562d057f9350#2 capacity: 512.00 MB usage: 1.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.95 KB,0.000181794%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Feb 20 04:41:20 localhost ceph-mon[288586]: using public_addr v2:172.18.0.108:0/0 -> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] Feb 20 04:41:20 localhost ceph-mon[288586]: starting mon.np0005625204 rank -1 at public addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] at bind addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005625204 fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(???) 
e0 preinit fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing) e3 sync_obtain_latest_monmap Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing) e3 sync_obtain_latest_monmap obtained monmap e3 Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing).mds e17 new map Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing).mds e17 print_map#012e17#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01116#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-20T07:58:28.398421+0000#012modified#0112026-02-20T09:40:14.722031+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01183#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26854}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26854 members: 26854#012[mds.mds.np0005625203.zsrwgk{0:26854} state up:active seq 13 addr [v2:172.18.0.107:6808/3334119751,v1:172.18.0.107:6809/3334119751] compat 
{c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005625202.akhmop{-1:17124} state up:standby seq 1 addr [v2:172.18.0.106:6808/3865978972,v1:172.18.0.106:6809/3865978972] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005625204.wnsphl{-1:26848} state up:standby seq 1 addr [v2:172.18.0.108:6808/2508223371,v1:172.18.0.108:6809/2508223371] compat {c=[1],r=[1],i=[17ff]}] Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mgr to host np0005625202.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing).osd e84 crush map has features 3314933000852226048, adjusting msgr requires Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mgr to host np0005625203.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 
172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mgr to host np0005625204.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Saving service mgr spec with placement label:mgr Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 20 04:41:20 localhost ceph-mon[288586]: Deploying daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 
20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 20 04:41:20 localhost ceph-mon[288586]: Deploying daemon mgr.np0005625203.lonygy on np0005625203.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mon to host np0005625199.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label _admin to host np0005625199.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:41:20 localhost 
ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 20 04:41:20 localhost ceph-mon[288586]: Deploying daemon mgr.np0005625204.exgrzx on np0005625204.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mon to host np0005625200.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label _admin to host np0005625200.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mon to host np0005625201.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost 
ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label _admin to host np0005625201.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mon to host np0005625202.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 
localhost ceph-mon[288586]: Added label _admin to host np0005625202.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mon to host np0005625203.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label _admin to host np0005625203.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label mon to host 
np0005625204.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Added label _admin to host np0005625204.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:20 localhost ceph-mon[288586]: Saving service mon spec with placement label:mon Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:20 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:20 localhost ceph-mon[288586]: Updating 
np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:20 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:41:20 localhost ceph-mon[288586]: Deploying daemon mon.np0005625204 on np0005625204.localdomain Feb 20 04:41:20 localhost ceph-mon[288586]: mon.np0005625204@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3 Feb 20 04:41:20 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd1e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 20 04:41:22 localhost ceph-mon[288586]: mon.np0005625204@-1(probing) e4 my rank is now 3 (was -1) Feb 20 04:41:22 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:41:22 localhost ceph-mon[288586]: paxos.3).electionLogic(0) init, first boot, initializing epoch at 1 Feb 20 04:41:22 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e4 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:22 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e4 adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints Feb 20 04:41:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:41:23 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e4 adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints Feb 20 04:41:23 localhost systemd[1]: tmp-crun.vgef3v.mount: Deactivated successfully. Feb 20 04:41:23 localhost podman[288625]: 2026-02-20 09:41:23.146786715 +0000 UTC m=+0.081701855 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2) Feb 20 04:41:23 localhost podman[288625]: 2026-02-20 09:41:23.157377165 +0000 UTC m=+0.092292305 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 20 04:41:23 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:41:24 localhost nova_compute[281288]: 2026-02-20 09:41:24.895 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:25 localhost nova_compute[281288]: 2026-02-20 09:41:25.031 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e4 adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e4 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code} Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e4 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 
04:41:25 localhost ceph-mon[288586]: mgrc update_daemon_metadata mon.np0005625204 metadata {addrs=[v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable),ceph_version_short=18.2.1-381.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005625204.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005625204.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux} Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625199 calling monitor election Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625200 calling monitor election Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625199 is new leader, mons np0005625199,np0005625201,np0005625200,np0005625204 in quorum (ranks 0,1,2,3) Feb 20 04:41:25 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:41:25 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:25 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:25 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e4 handle_auth_request failed to assign global_id Feb 20 04:41:26 localhost openstack_network_exporter[244414]: ERROR 09:41:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:41:26 localhost 
openstack_network_exporter[244414]: Feb 20 04:41:26 localhost openstack_network_exporter[244414]: ERROR 09:41:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:41:26 localhost openstack_network_exporter[244414]: Feb 20 04:41:26 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:26 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:41:26 localhost ceph-mon[288586]: Deploying daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:41:27 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e4 adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints Feb 20 04:41:27 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fcf20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 20 04:41:27 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:41:27 localhost ceph-mon[288586]: paxos.3).electionLogic(18) init, last seen epoch 18 Feb 20 04:41:27 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:27 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:28 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Feb 20 04:41:28 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Feb 20 04:41:28 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Feb 20 04:41:29 localhost 
nova_compute[281288]: 2026-02-20 09:41:29.897 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:30 localhost nova_compute[281288]: 2026-02-20 09:41:30.035 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:30 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Feb 20 04:41:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:41:30 localhost systemd[1]: tmp-crun.HClEgM.mount: Deactivated successfully. Feb 20 04:41:30 localhost podman[288644]: 2026-02-20 09:41:30.886601676 +0000 UTC m=+0.096518597 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:41:30 localhost podman[288644]: 2026-02-20 09:41:30.898043602 +0000 UTC m=+0.107960533 container exec_died 
010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:41:30 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625200 calling monitor election Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625199 calling monitor election Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625199 is new leader, mons np0005625199,np0005625201,np0005625200,np0005625204 in quorum (ranks 0,1,2,3) Feb 20 04:41:32 localhost ceph-mon[288586]: Health check failed: 1/5 mons down, quorum np0005625199,np0005625201,np0005625200,np0005625204 (MON_DOWN) Feb 20 04:41:32 localhost ceph-mon[288586]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005625199,np0005625201,np0005625200,np0005625204 Feb 20 04:41:32 localhost ceph-mon[288586]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005625199,np0005625201,np0005625200,np0005625204 Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625203 (rank 4) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum) Feb 20 04:41:32 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:32 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Feb 20 04:41:32 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 20 04:41:32 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:41:32 localhost ceph-mon[288586]: paxos.3).electionLogic(20) init, last seen epoch 20 Feb 
20 04:41:32 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:32 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:34 localhost podman[288794]: 2026-02-20 09:41:34.403179072 +0000 UTC m=+0.093383189 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2, architecture=x86_64, vcs-type=git, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True) Feb 20 04:41:34 localhost podman[288794]: 2026-02-20 09:41:34.515077917 +0000 UTC m=+0.205282064 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, release=1770267347, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:41:34 localhost nova_compute[281288]: 2026-02-20 09:41:34.901 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:35 localhost nova_compute[281288]: 2026-02-20 09:41:35.037 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:41:37 localhost systemd[1]: tmp-crun.QlJYzu.mount: Deactivated successfully. 
Feb 20 04:41:37 localhost podman[288918]: 2026-02-20 09:41:37.145387953 +0000 UTC m=+0.080817428 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:41:37 localhost podman[288918]: 2026-02-20 09:41:37.162999991 +0000 UTC m=+0.098429456 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:41:37 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625204@3(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:37 localhost ceph-mds[284061]: mds.beacon.mds.np0005625204.wnsphl missed beacon ack from the monitors Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625199 calling monitor election Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625200 calling monitor election Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:41:37 localhost ceph-mon[288586]: mon.np0005625199 is new leader, mons np0005625199,np0005625201,np0005625200,np0005625204,np0005625203,np0005625202 in quorum (ranks 0,1,2,3,4,5) Feb 20 04:41:37 localhost ceph-mon[288586]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005625199,np0005625201,np0005625200,np0005625204) Feb 20 04:41:37 localhost ceph-mon[288586]: Cluster is now healthy Feb 20 04:41:37 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:41:37 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:37 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:39 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:39 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' 
entity='mgr.np0005625199.ileebh' Feb 20 04:41:39 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:39 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:39 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:39 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:39 localhost ceph-mon[288586]: Updating np0005625199.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:39 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:39 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:39 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:39 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:39 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:39 localhost nova_compute[281288]: 2026-02-20 09:41:39.942 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:40 localhost nova_compute[281288]: 2026-02-20 09:41:40.040 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:40 localhost ceph-mon[288586]: Updating np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:40 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:40 localhost ceph-mon[288586]: Updating 
np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:40 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:40 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:41:40 localhost podman[289279]: 2026-02-20 09:41:40.733569279 +0000 UTC m=+0.084356738 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, release=1770267347, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Feb 20 04:41:40 localhost podman[289279]: 2026-02-20 09:41:40.751085234 +0000 UTC m=+0.101872703 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 20 04:41:40 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:41:41 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:41 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:41 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:41 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:41 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:41:42 localhost ceph-mon[288586]: Reconfiguring mon.np0005625199 (monmap changed)... Feb 20 04:41:42 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625199 on np0005625199.localdomain Feb 20 04:41:42 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:43 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:43 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625199.ileebh (monmap changed)... 
Feb 20 04:41:43 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625199.ileebh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:41:43 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625199.ileebh on np0005625199.localdomain Feb 20 04:41:43 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:43 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:41:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:41:44 localhost podman[289299]: 2026-02-20 09:41:44.168429111 +0000 UTC m=+0.096346862 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 04:41:44 localhost systemd[1]: tmp-crun.92BqaX.mount: Deactivated successfully. Feb 20 04:41:44 localhost podman[289299]: 2026-02-20 09:41:44.241990812 +0000 UTC m=+0.169908553 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:41:44 localhost podman[289300]: 2026-02-20 09:41:44.242069184 +0000 UTC m=+0.170317225 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:41:44 localhost podman[289300]: 2026-02-20 09:41:44.271301134 +0000 UTC m=+0.199549205 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:41:44 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:41:44 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:41:44 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:44 localhost ceph-mon[288586]: Reconfiguring crash.np0005625199 (monmap changed)... Feb 20 04:41:44 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625199", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:41:44 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625199 on np0005625199.localdomain Feb 20 04:41:44 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:44 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:44 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:41:44 localhost nova_compute[281288]: 2026-02-20 09:41:44.945 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.042 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 
04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.741 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.743 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:41:45 localhost nova_compute[281288]: 2026-02-20 09:41:45.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:41:45 localhost ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)... 
Feb 20 04:41:45 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain Feb 20 04:41:45 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:45 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' Feb 20 04:41:45 localhost ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:41:46 localhost ceph-mon[288586]: mon.np0005625204@3(peon).osd e84 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Feb 20 04:41:46 localhost ceph-mon[288586]: mon.np0005625204@3(peon).osd e84 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Feb 20 04:41:46 localhost ceph-mon[288586]: mon.np0005625204@3(peon).osd e85 e85: 6 total, 6 up, 6 in Feb 20 04:41:46 localhost systemd[1]: session-20.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd-logind[759]: Session 20 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd[1]: session-26.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd[1]: session-24.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd[1]: session-17.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd[1]: session-27.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd[1]: session-27.scope: Consumed 3min 23.356s CPU time. Feb 20 04:41:46 localhost systemd-logind[759]: Session 27 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd-logind[759]: Session 17 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd-logind[759]: Session 26 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd-logind[759]: Session 24 logged out. Waiting for processes to exit. 
Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 20. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 26. Feb 20 04:41:46 localhost systemd[1]: session-18.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd-logind[759]: Session 18 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd[1]: session-23.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd[1]: session-19.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd[1]: session-25.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd-logind[759]: Session 23 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd[1]: session-22.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd[1]: session-21.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd-logind[759]: Session 25 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd[1]: session-15.scope: Deactivated successfully. Feb 20 04:41:46 localhost systemd-logind[759]: Session 19 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd-logind[759]: Session 21 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd-logind[759]: Session 22 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd-logind[759]: Session 15 logged out. Waiting for processes to exit. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 24. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 17. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 27. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 18. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 23. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 19. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 25. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 22. 
Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 21. Feb 20 04:41:46 localhost systemd-logind[759]: Removed session 15. Feb 20 04:41:46 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:41:46 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/191250644' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.296 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.552s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:41:46 localhost sshd[289362]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:41:46 localhost systemd-logind[759]: New session 65 of user ceph-admin. Feb 20 04:41:46 localhost systemd[1]: Started Session 65 of User ceph-admin. Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.485 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.486 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.790 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.792 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11851MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.792 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.793 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.874 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.875 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.875 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:41:46 localhost nova_compute[281288]: 2026-02-20 09:41:46.915 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:41:46 localhost ceph-mon[288586]: from='client.? 172.18.0.103:0/2662030267' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 20 04:41:46 localhost ceph-mon[288586]: Activating manager daemon np0005625201.mtnyvu Feb 20 04:41:46 localhost ceph-mon[288586]: from='client.? 
172.18.0.103:0/2662030267' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 20 04:41:46 localhost ceph-mon[288586]: Manager daemon np0005625201.mtnyvu is now available Feb 20 04:41:46 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625201.mtnyvu/mirror_snapshot_schedule"} : dispatch Feb 20 04:41:46 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625201.mtnyvu/mirror_snapshot_schedule"} : dispatch Feb 20 04:41:46 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625201.mtnyvu/trash_purge_schedule"} : dispatch Feb 20 04:41:46 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625201.mtnyvu/trash_purge_schedule"} : dispatch Feb 20 04:41:47 localhost nova_compute[281288]: 2026-02-20 09:41:47.405 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:41:47 localhost nova_compute[281288]: 2026-02-20 09:41:47.412 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:41:47 localhost nova_compute[281288]: 2026-02-20 09:41:47.439 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed 
for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:41:47 localhost nova_compute[281288]: 2026-02-20 09:41:47.442 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:41:47 localhost nova_compute[281288]: 2026-02-20 09:41:47.442 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:41:47 localhost podman[289494]: 2026-02-20 09:41:47.600807314 +0000 UTC m=+0.103027690 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=1770267347, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2) Feb 20 04:41:47 localhost podman[241968]: time="2026-02-20T09:41:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:41:47 localhost podman[241968]: @ - - [20/Feb/2026:09:41:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:41:47 localhost podman[289494]: 2026-02-20 09:41:47.777836318 +0000 UTC m=+0.280056733 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, 
RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, release=1770267347, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:41:47 localhost podman[241968]: @ - - [20/Feb/2026:09:41:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18258 "" "Go-http-client/1.1" Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.441 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.442 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.740 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.740 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.961 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.962 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.963 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:41:48 localhost nova_compute[281288]: 2026-02-20 09:41:48.963 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:41:49 localhost ceph-mon[288586]: [20/Feb/2026:09:41:47] ENGINE Bus STARTING Feb 20 04:41:49 localhost ceph-mon[288586]: [20/Feb/2026:09:41:47] ENGINE Serving on http://172.18.0.105:8765 Feb 20 04:41:49 localhost ceph-mon[288586]: [20/Feb/2026:09:41:48] ENGINE Serving on https://172.18.0.105:7150 Feb 20 04:41:49 localhost ceph-mon[288586]: [20/Feb/2026:09:41:48] ENGINE Bus STARTED Feb 20 04:41:49 localhost ceph-mon[288586]: [20/Feb/2026:09:41:48] ENGINE Client ('172.18.0.105', 35862) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:49 localhost nova_compute[281288]: 2026-02-20 09:41:49.416 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - 
- -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:41:49 localhost nova_compute[281288]: 2026-02-20 09:41:49.429 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:41:49 localhost nova_compute[281288]: 2026-02-20 09:41:49.430 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:41:49 
localhost nova_compute[281288]: 2026-02-20 09:41:49.430 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:49 localhost nova_compute[281288]: 2026-02-20 09:41:49.431 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:41:49 localhost nova_compute[281288]: 2026-02-20 09:41:49.949 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:50 localhost nova_compute[281288]: 2026-02-20 09:41:50.043 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:50 localhost ceph-mon[288586]: mon.np0005625204@3(peon).osd e85 _set_new_cache_sizes cache_size:1019574915 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:41:50 localhost nova_compute[281288]: 2026-02-20 09:41:50.430 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost 
ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 
172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625199", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625199", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 20 04:41:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:41:51 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M Feb 20 04:41:51 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:41:51 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M Feb 20 04:41:51 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M Feb 20 04:41:51 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:41:51 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625199.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating 
np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:51 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:41:52 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:52 localhost ceph-mon[288586]: Updating np0005625199.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:52 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:52 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:52 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:52 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:41:52 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:52 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:52 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:41:53 localhost podman[290394]: 2026-02-20 09:41:53.540874605 +0000 UTC m=+0.079930160 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127) Feb 20 04:41:53 localhost podman[290394]: 2026-02-20 09:41:53.555965045 +0000 UTC m=+0.095020570 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ceilometer_agent_compute, io.buildah.version=1.41.3) Feb 20 04:41:53 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:41:53 localhost ceph-mon[288586]: Updating np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:53 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:53 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:53 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:53 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:53 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' 
entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:53 localhost ceph-mon[288586]: Reconfiguring mon.np0005625200 (monmap changed)... Feb 20 04:41:53 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:41:53 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625200 on np0005625200.localdomain Feb 20 04:41:54 localhost nova_compute[281288]: 2026-02-20 09:41:54.989 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:55 localhost nova_compute[281288]: 2026-02-20 09:41:55.044 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:41:55 localhost ceph-mon[288586]: mon.np0005625204@3(peon).osd e85 _set_new_cache_sizes cache_size:1020043081 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:41:55 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:55 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:55 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625200.ypbkax (monmap changed)... 
Feb 20 04:41:55 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:41:55 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:41:55 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625200.ypbkax on np0005625200.localdomain Feb 20 04:41:55 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:55 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:55 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:41:56 localhost ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)... 
Feb 20 04:41:56 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain Feb 20 04:41:56 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:56 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:56 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:41:56 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:41:56 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:56 localhost openstack_network_exporter[244414]: ERROR 09:41:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:41:56 localhost openstack_network_exporter[244414]: Feb 20 04:41:56 localhost openstack_network_exporter[244414]: ERROR 09:41:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:41:56 localhost openstack_network_exporter[244414]: Feb 20 04:41:57 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)... 
Feb 20 04:41:57 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain Feb 20 04:41:57 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:57 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:57 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:41:57 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:41:58 localhost ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)... Feb 20 04:41:58 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain Feb 20 04:41:58 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:58 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:58 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:41:58 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:41:59 localhost ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)... 
Feb 20 04:41:59 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:41:59 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:59 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:41:59 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 20 04:42:00 localhost nova_compute[281288]: 2026-02-20 09:42:00.027 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:00 localhost nova_compute[281288]: 2026-02-20 09:42:00.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:00 localhost ceph-mon[288586]: mon.np0005625204@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054453 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:00 localhost ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)... Feb 20 04:42:00 localhost ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain Feb 20 04:42:00 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:00 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:00 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 20 04:42:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:42:01 localhost systemd[1]: tmp-crun.grmIZs.mount: Deactivated successfully. 
Feb 20 04:42:01 localhost podman[290413]: 2026-02-20 09:42:01.140902736 +0000 UTC m=+0.078468344 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:42:01 localhost podman[290413]: 2026-02-20 09:42:01.176087381 +0000 UTC m=+0.113652969 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:42:01 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:42:01 localhost ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)... Feb 20 04:42:01 localhost ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain Feb 20 04:42:01 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:01 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:01 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:42:01 localhost ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:42:02 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 20 04:42:02 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0 Feb 20 04:42:02 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0 Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625204@3(peon) e7 my rank is now 2 (was 3) Feb 20 04:42:02 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:42:02 localhost ceph-mon[288586]: paxos.2).electionLogic(24) init, last seen epoch 24 Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e7 collect_metadata vda: no unique device id for 
vda: fallback method has no model nor serial Feb 20 04:42:02 localhost ceph-mgr[287186]: --2- 172.18.0.108:0/150842184 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x562e2f9b5800 0x562e2f9b6b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Feb 20 04:42:02 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Feb 20 04:42:02 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Feb 20 04:42:02 localhost ceph-mds[284061]: --2- [v2:172.18.0.108:6808/2508223371,v1:172.18.0.108:6809/2508223371] >> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] conn(0x55bd6fd77400 0x55bd6eff5180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Feb 20 04:42:02 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fb84000 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:02 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:42:02 localhost ceph-mon[288586]: Remove daemons mon.np0005625199 Feb 20 04:42:02 localhost ceph-mon[288586]: Safe to remove mon.np0005625199: new quorum should be ['np0005625201', 'np0005625200', 'np0005625204', 'np0005625203', 'np0005625202'] (from ['np0005625201', 'np0005625200', 'np0005625204', 'np0005625203', 'np0005625202']) Feb 20 04:42:02 localhost ceph-mon[288586]: Removing monitor np0005625199 from monmap... 
Feb 20 04:42:02 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon rm", "name": "np0005625199"} : dispatch Feb 20 04:42:02 localhost ceph-mon[288586]: Removing daemon mon.np0005625199 from np0005625199.localdomain -- ports [] Feb 20 04:42:02 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... Feb 20 04:42:02 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:02 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625200 calling monitor election Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:42:02 localhost ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625204,np0005625203,np0005625202 in quorum (ranks 0,1,2,3,4) Feb 20 04:42:02 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:42:03 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:42:03 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:03 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:03 localhost 
ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:42:03 localhost sshd[290437]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:42:04 localhost ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)... Feb 20 04:42:04 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:42:04 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:04 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:04 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:05 localhost nova_compute[281288]: 2026-02-20 09:42:05.029 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:05 localhost nova_compute[281288]: 2026-02-20 09:42:05.048 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:05 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054725 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:05 localhost ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)... 
Feb 20 04:42:05 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:42:05 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:05 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:05 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 20 04:42:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:42:06.004 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:42:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:42:06.005 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:42:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:42:06.006 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:42:06 localhost ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)... 
Feb 20 04:42:06 localhost ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:42:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:06 localhost ceph-mon[288586]: Removed label mon from host np0005625199.localdomain Feb 20 04:42:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 20 04:42:07 localhost ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)... Feb 20 04:42:07 localhost ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:42:07 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:07 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:07 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:07 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:42:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:42:08 localhost podman[290439]: 2026-02-20 09:42:08.136417216 +0000 UTC m=+0.074838695 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:42:08 localhost podman[290439]: 2026-02-20 09:42:08.171983402 +0000 UTC m=+0.110404911 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:42:08 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:42:08 localhost ceph-mon[288586]: Removed label mgr from host np0005625199.localdomain Feb 20 04:42:08 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)... 
Feb 20 04:42:08 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain Feb 20 04:42:08 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:08 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:08 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:08 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:09 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)... Feb 20 04:42:09 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain Feb 20 04:42:09 localhost ceph-mon[288586]: Removed label _admin from host np0005625199.localdomain Feb 20 04:42:09 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:09 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:09 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:42:10 localhost nova_compute[281288]: 2026-02-20 09:42:10.049 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:10 localhost nova_compute[281288]: 2026-02-20 09:42:10.050 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:10 localhost nova_compute[281288]: 2026-02-20 09:42:10.051 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:42:10 localhost nova_compute[281288]: 2026-02-20 09:42:10.051 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:10 localhost nova_compute[281288]: 2026-02-20 09:42:10.085 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:10 localhost nova_compute[281288]: 2026-02-20 09:42:10.086 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:10 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:10 localhost ceph-mon[288586]: Reconfiguring mon.np0005625203 (monmap changed)... 
Feb 20 04:42:10 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain Feb 20 04:42:10 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:10 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:10 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:10 localhost podman[290517]: Feb 20 04:42:10 localhost podman[290517]: 2026-02-20 09:42:10.724577768 +0000 UTC m=+0.077380241 container create 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, name=rhceph, build-date=2026-02-09T10:25:24Z, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, GIT_BRANCH=main, 
GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:42:10 localhost systemd[1]: Started libpod-conmon-199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6.scope. Feb 20 04:42:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:42:10 localhost systemd[1]: Started libcrun container. Feb 20 04:42:10 localhost podman[290517]: 2026-02-20 09:42:10.694587269 +0000 UTC m=+0.047389762 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:10 localhost podman[290517]: 2026-02-20 09:42:10.80236089 +0000 UTC m=+0.155163363 container init 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, vcs-type=git, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:42:10 localhost systemd[1]: tmp-crun.iQ5OwI.mount: 
Deactivated successfully. Feb 20 04:42:10 localhost podman[290517]: 2026-02-20 09:42:10.821201975 +0000 UTC m=+0.174004488 container start 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1770267347, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph) Feb 20 04:42:10 localhost keen_villani[290532]: 167 167 Feb 20 04:42:10 localhost podman[290517]: 2026-02-20 09:42:10.821754421 +0000 UTC m=+0.174556934 container attach 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, 
ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, release=1770267347, io.openshift.expose-services=, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux ) Feb 20 04:42:10 localhost systemd[1]: libpod-199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6.scope: Deactivated successfully. 
Feb 20 04:42:10 localhost podman[290517]: 2026-02-20 09:42:10.82771162 +0000 UTC m=+0.180514073 container died 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:42:10 localhost podman[290547]: 2026-02-20 09:42:10.903655517 +0000 UTC m=+0.068448933 container remove 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.expose-services=, description=Red Hat 
Ceph Storage 7, GIT_CLEAN=True, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.42.2, version=7) Feb 20 04:42:10 localhost systemd[1]: libpod-conmon-199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6.scope: Deactivated successfully. Feb 20 04:42:10 localhost podman[290533]: 2026-02-20 09:42:10.876429211 +0000 UTC m=+0.092554406 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public) Feb 20 04:42:10 localhost 
podman[290533]: 2026-02-20 09:42:10.95910626 +0000 UTC m=+0.175231475 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:42:10 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:42:11 localhost podman[290628]: Feb 20 04:42:11 localhost podman[290628]: 2026-02-20 09:42:11.577201993 +0000 UTC m=+0.078423193 container create 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, build-date=2026-02-09T10:25:24Z) Feb 20 04:42:11 localhost systemd[1]: Started libpod-conmon-29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615.scope. Feb 20 04:42:11 localhost systemd[1]: Started libcrun container. 
Feb 20 04:42:11 localhost podman[290628]: 2026-02-20 09:42:11.636181011 +0000 UTC m=+0.137401961 container init 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vcs-type=git, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:42:11 localhost podman[290628]: 2026-02-20 09:42:11.643349085 +0000 UTC m=+0.144570065 container start 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, name=rhceph, release=1770267347, description=Red Hat Ceph Storage 7, 
com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:42:11 localhost podman[290628]: 2026-02-20 09:42:11.643590922 +0000 UTC m=+0.144811872 container attach 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vendor=Red Hat, Inc., 
GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public) Feb 20 04:42:11 localhost tender_morse[290643]: 167 167 Feb 20 04:42:11 localhost podman[290628]: 2026-02-20 09:42:11.545955246 +0000 UTC m=+0.047176256 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:11 localhost systemd[1]: libpod-29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615.scope: Deactivated successfully. Feb 20 04:42:11 localhost podman[290628]: 2026-02-20 09:42:11.646121308 +0000 UTC m=+0.147342258 container died 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=1770267347, ceph=True, maintainer=Guillaume Abrioux ) Feb 20 04:42:11 localhost ceph-mon[288586]: Reconfiguring 
crash.np0005625204 (monmap changed)... Feb 20 04:42:11 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain Feb 20 04:42:11 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:11 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:11 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 20 04:42:11 localhost systemd[1]: var-lib-containers-storage-overlay-2a7c64a96c40c35b21abaeb1833ec8a76641ce9852319b6ed0e4e3925b0b93f6-merged.mount: Deactivated successfully. Feb 20 04:42:11 localhost podman[290649]: 2026-02-20 09:42:11.732820028 +0000 UTC m=+0.073020630 container remove 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, RELEASE=main, ceph=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a 
fully featured and supported base image., io.buildah.version=1.42.2, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph) Feb 20 04:42:11 localhost systemd[1]: libpod-conmon-29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615.scope: Deactivated successfully. Feb 20 04:42:12 localhost podman[290725]: Feb 20 04:42:12 localhost podman[290725]: 2026-02-20 09:42:12.527784544 +0000 UTC m=+0.077584498 container create 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , release=1770267347, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:42:12 localhost systemd[1]: Started libpod-conmon-0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12.scope. Feb 20 04:42:12 localhost systemd[1]: Started libcrun container. 
Feb 20 04:42:12 localhost podman[290725]: 2026-02-20 09:42:12.497964069 +0000 UTC m=+0.047764043 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:12 localhost podman[290725]: 2026-02-20 09:42:12.599650959 +0000 UTC m=+0.149450903 container init 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=1770267347, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=7, distribution-scope=public, ceph=True, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 20 04:42:12 localhost nostalgic_kowalevski[290740]: 167 167 Feb 20 04:42:12 localhost podman[290725]: 2026-02-20 09:42:12.61773501 +0000 UTC m=+0.167534954 container start 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, name=rhceph, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:42:12 localhost podman[290725]: 2026-02-20 09:42:12.618096941 +0000 UTC m=+0.167896925 container attach 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, 
architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1770267347, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , name=rhceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:42:12 localhost systemd[1]: libpod-0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12.scope: Deactivated successfully. 
Feb 20 04:42:12 localhost podman[290725]: 2026-02-20 09:42:12.620993278 +0000 UTC m=+0.170793272 container died 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=) Feb 20 04:42:12 localhost ceph-mon[288586]: Reconfiguring osd.0 (monmap changed)... 
Feb 20 04:42:12 localhost ceph-mon[288586]: Reconfiguring daemon osd.0 on np0005625204.localdomain Feb 20 04:42:12 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:12 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:12 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 20 04:42:12 localhost podman[290745]: 2026-02-20 09:42:12.723677587 +0000 UTC m=+0.090529035 container remove 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, name=rhceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:42:12 localhost systemd[1]: 
var-lib-containers-storage-overlay-fe4148f1cdf25b963e05d78801b63ba5e781225cb31f3535550571ccb6b78f7e-merged.mount: Deactivated successfully. Feb 20 04:42:12 localhost systemd[1]: libpod-conmon-0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12.scope: Deactivated successfully. Feb 20 04:42:13 localhost sshd[290832]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:42:13 localhost podman[290822]: Feb 20 04:42:13 localhost podman[290822]: 2026-02-20 09:42:13.602047954 +0000 UTC m=+0.074436713 container create b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.expose-services=, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2026-02-09T10:25:24Z, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:42:13 localhost systemd[1]: Started libpod-conmon-b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1.scope. 
Feb 20 04:42:13 localhost systemd[1]: Started libcrun container. Feb 20 04:42:13 localhost podman[290822]: 2026-02-20 09:42:13.566725624 +0000 UTC m=+0.039114443 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:13 localhost podman[290822]: 2026-02-20 09:42:13.675600069 +0000 UTC m=+0.147988838 container init b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, vcs-type=git, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, distribution-scope=public, GIT_BRANCH=main) Feb 20 04:42:13 localhost podman[290822]: 2026-02-20 09:42:13.68431863 +0000 UTC m=+0.156707399 container start b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, 
distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:42:13 localhost podman[290822]: 2026-02-20 09:42:13.685779153 +0000 UTC m=+0.158167922 container attach b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, build-date=2026-02-09T10:25:24Z, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 04:42:13 localhost systemd[1]: libpod-b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1.scope: Deactivated successfully. Feb 20 04:42:13 localhost friendly_stonebraker[290839]: 167 167 Feb 20 04:42:13 localhost podman[290822]: 2026-02-20 09:42:13.689471365 +0000 UTC m=+0.161860164 container died b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, version=7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_CLEAN=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, 
io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:42:13 localhost ceph-mon[288586]: Reconfiguring osd.3 (monmap changed)... Feb 20 04:42:13 localhost ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain Feb 20 04:42:13 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:13 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:13 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:42:13 localhost systemd[1]: tmp-crun.iV7Lo9.mount: Deactivated successfully. Feb 20 04:42:13 localhost systemd[1]: var-lib-containers-storage-overlay-3c74b7653ffc47217de9b755a70dd9a5367d950122ff4c38a60a68c00bc2351f-merged.mount: Deactivated successfully. 
Feb 20 04:42:13 localhost podman[290844]: 2026-02-20 09:42:13.814549565 +0000 UTC m=+0.115162834 container remove b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, maintainer=Guillaume Abrioux , GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7) Feb 20 04:42:13 localhost systemd[1]: libpod-conmon-b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1.scope: Deactivated successfully. Feb 20 04:42:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:42:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:42:14 localhost podman[290913]: 2026-02-20 09:42:14.528684377 +0000 UTC m=+0.087619308 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Feb 20 04:42:14 localhost 
podman[290913]: 2026-02-20 09:42:14.565029506 +0000 UTC m=+0.123964467 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:42:14 localhost podman[290912]: 2026-02-20 09:42:14.581299874 +0000 UTC 
m=+0.140025029 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:42:14 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:42:14 localhost podman[290926]: Feb 20 04:42:14 localhost podman[290926]: 2026-02-20 09:42:14.65120048 +0000 UTC m=+0.184276326 container create 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.42.2, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True) Feb 20 04:42:14 localhost podman[290912]: 2026-02-20 09:42:14.696067116 +0000 UTC m=+0.254792231 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 04:42:14 localhost systemd[1]: Started libpod-conmon-5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441.scope. Feb 20 04:42:14 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)... Feb 20 04:42:14 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain Feb 20 04:42:14 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:14 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:14 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:14 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:42:14 localhost podman[290926]: 2026-02-20 09:42:14.614212121 +0000 UTC m=+0.147287977 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:14 localhost systemd[1]: Started libcrun container. Feb 20 04:42:14 localhost podman[290926]: 2026-02-20 09:42:14.731047245 +0000 UTC m=+0.264123091 container init 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, distribution-scope=public, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=) Feb 20 04:42:14 localhost podman[290926]: 2026-02-20 09:42:14.740803767 +0000 UTC m=+0.273879613 container start 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, release=1770267347, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:42:14 localhost podman[290926]: 2026-02-20 09:42:14.741088645 +0000 UTC m=+0.274164491 container attach 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, architecture=x86_64, build-date=2026-02-09T10:25:24Z, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, release=1770267347, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 04:42:14 localhost sad_saha[290973]: 167 167 Feb 20 04:42:14 localhost systemd[1]: libpod-5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441.scope: Deactivated successfully. Feb 20 04:42:14 localhost podman[290926]: 2026-02-20 09:42:14.745597531 +0000 UTC m=+0.278673417 container died 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, version=7, GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public) Feb 20 04:42:14 
localhost systemd[1]: var-lib-containers-storage-overlay-d91c343a0d58b4f0a87f394aaf88111a173056b50935e0e0697880220cf8ee5e-merged.mount: Deactivated successfully. Feb 20 04:42:14 localhost podman[290978]: 2026-02-20 09:42:14.846782035 +0000 UTC m=+0.088629239 container remove 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, maintainer=Guillaume Abrioux , vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1770267347, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:42:14 localhost systemd[1]: libpod-conmon-5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441.scope: Deactivated successfully. 
Feb 20 04:42:15 localhost nova_compute[281288]: 2026-02-20 09:42:15.087 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:15 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:15 localhost podman[291048]: Feb 20 04:42:15 localhost podman[291048]: 2026-02-20 09:42:15.60338032 +0000 UTC m=+0.078955488 container create 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_BRANCH=main) Feb 20 04:42:15 localhost systemd[1]: Started libpod-conmon-3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc.scope. 
Feb 20 04:42:15 localhost systemd[1]: Started libcrun container. Feb 20 04:42:15 localhost podman[291048]: 2026-02-20 09:42:15.669272685 +0000 UTC m=+0.144847833 container init 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, GIT_CLEAN=True) Feb 20 04:42:15 localhost podman[291048]: 2026-02-20 09:42:15.571822684 +0000 UTC m=+0.047397882 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:15 localhost podman[291048]: 2026-02-20 09:42:15.679002227 +0000 UTC m=+0.154577375 container start 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-type=git, 
com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_CLEAN=True, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:42:15 localhost podman[291048]: 2026-02-20 09:42:15.679220914 +0000 UTC m=+0.154796072 container attach 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, architecture=x86_64, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, 
description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=1770267347, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public) Feb 20 04:42:15 localhost magical_snyder[291063]: 167 167 Feb 20 04:42:15 localhost systemd[1]: libpod-3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc.scope: Deactivated successfully. Feb 20 04:42:15 localhost podman[291048]: 2026-02-20 09:42:15.683157301 +0000 UTC m=+0.158732449 container died 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, ceph=True, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_CLEAN=True, io.buildah.version=1.42.2, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, 
maintainer=Guillaume Abrioux ) Feb 20 04:42:15 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)... Feb 20 04:42:15 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain Feb 20 04:42:15 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:15 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:15 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:42:15 localhost podman[291068]: 2026-02-20 09:42:15.777594613 +0000 UTC m=+0.085859456 container remove 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.openshift.expose-services=, release=1770267347, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base 
image., maintainer=Guillaume Abrioux , vendor=Red Hat, Inc.) Feb 20 04:42:15 localhost systemd[1]: var-lib-containers-storage-overlay-ea52fc9008f4e689bde064e32d6a152515a5859ef1e2fd6ab06df578c787ca01-merged.mount: Deactivated successfully. Feb 20 04:42:15 localhost systemd[1]: libpod-conmon-3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc.scope: Deactivated successfully. Feb 20 04:42:16 localhost ceph-mon[288586]: Reconfiguring mon.np0005625204 (monmap changed)... Feb 20 04:42:16 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain Feb 20 04:42:16 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:16 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:16 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:16 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:16 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:16 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:17 localhost podman[241968]: time="2026-02-20T09:42:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:42:17 localhost podman[241968]: @ - - [20/Feb/2026:09:42:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:42:17 localhost podman[241968]: @ - - [20/Feb/2026:09:42:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18254 "" "Go-http-client/1.1" Feb 20 04:42:18 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' 
entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:18 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:18 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:42:18 localhost ceph-mon[288586]: Removing np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Removing np0005625199.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:42:18 localhost ceph-mon[288586]: Removing np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:42:18 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:18 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating 
np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:42:18 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:42:20 localhost nova_compute[281288]: 2026-02-20 09:42:20.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:20 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: 
Removing daemon mgr.np0005625199.ileebh from np0005625199.localdomain -- ports [9283, 8765] Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: Added label _no_schedule to host np0005625199.localdomain Feb 20 04:42:20 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:20 localhost ceph-mon[288586]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005625199.localdomain Feb 20 04:42:22 localhost ceph-mon[288586]: Removing key for mgr.np0005625199.ileebh Feb 20 04:42:22 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth rm", "entity": "mgr.np0005625199.ileebh"} : dispatch Feb 20 04:42:22 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005625199.ileebh"}]': finished Feb 20 04:42:22 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:22 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:22 localhost sshd[291422]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:42:23 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:23 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain"} : dispatch Feb 20 04:42:23 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain"}]': finished Feb 20 04:42:23 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' 
entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:42:23 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:42:24 localhost systemd[1]: tmp-crun.R2JFpO.mount: Deactivated successfully. Feb 20 04:42:24 localhost podman[291442]: 2026-02-20 09:42:24.095024977 +0000 UTC m=+0.090204345 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Feb 20 04:42:24 localhost podman[291442]: 2026-02-20 09:42:24.11011116 +0000 UTC m=+0.105290528 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:42:24 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:42:24 localhost ceph-mon[288586]: Removed host np0005625199.localdomain Feb 20 04:42:24 localhost ceph-mon[288586]: host np0005625199.localdomain `cephadm ls` failed: Cannot decode JSON: #012Traceback (most recent call last):#012 File "/usr/share/ceph/mgr/cephadm/serve.py", line 1540, in _run_cephadm_json#012 return json.loads(''.join(out))#012 File "/lib64/python3.9/json/__init__.py", line 346, in loads#012 return _default_decoder.decode(s)#012 File "/lib64/python3.9/json/decoder.py", line 337, in decode#012 obj, end = self.raw_decode(s, idx=_w(s, 0).end())#012 File "/lib64/python3.9/json/decoder.py", line 355, in raw_decode#012 raise JSONDecodeError("Expecting value", s, err.value) from None#012json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) Feb 20 04:42:24 localhost ceph-mon[288586]: executing refresh((['np0005625199.localdomain', 'np0005625200.localdomain', 'np0005625201.localdomain', 'np0005625202.localdomain', 'np0005625203.localdomain', 'np0005625204.localdomain'],)) failed.#012Traceback (most recent call last):#012 File "/usr/share/ceph/mgr/cephadm/utils.py", line 94, in do_work#012 return f(*arg)#012 File "/usr/share/ceph/mgr/cephadm/serve.py", line 317, in refresh#012 and not self.mgr.inventory.has_label(host, 
SpecialHostLabels.NO_MEMORY_AUTOTUNE)#012 File "/usr/share/ceph/mgr/cephadm/inventory.py", line 253, in has_label#012 host = self._get_stored_name(host)#012 File "/usr/share/ceph/mgr/cephadm/inventory.py", line 181, in _get_stored_name#012 self.assert_host(host)#012 File "/usr/share/ceph/mgr/cephadm/inventory.py", line 209, in assert_host#012 raise OrchestratorError('host %s does not exist' % host)#012orchestrator._interface.OrchestratorError: host np0005625199.localdomain does not exist Feb 20 04:42:24 localhost ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)... Feb 20 04:42:24 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:24 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain Feb 20 04:42:25 localhost nova_compute[281288]: 2026-02-20 09:42:25.091 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:25 localhost nova_compute[281288]: 2026-02-20 09:42:25.093 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:25 localhost nova_compute[281288]: 2026-02-20 09:42:25.093 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:42:25 localhost nova_compute[281288]: 2026-02-20 09:42:25.093 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:25 localhost nova_compute[281288]: 2026-02-20 09:42:25.130 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:25 localhost nova_compute[281288]: 2026-02-20 09:42:25.131 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:25 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:25 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:25 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:25 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:42:26 localhost openstack_network_exporter[244414]: ERROR 09:42:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:42:26 localhost openstack_network_exporter[244414]: Feb 20 04:42:26 localhost openstack_network_exporter[244414]: ERROR 09:42:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:42:26 localhost openstack_network_exporter[244414]: Feb 20 04:42:26 localhost ceph-mon[288586]: Reconfiguring mon.np0005625200 (monmap changed)... 
Feb 20 04:42:26 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625200 on np0005625200.localdomain Feb 20 04:42:26 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:26 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:26 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:26 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:27 localhost sshd[291462]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:42:27 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625200.ypbkax (monmap changed)... Feb 20 04:42:27 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625200.ypbkax on np0005625200.localdomain Feb 20 04:42:27 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:27 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:27 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:42:28 localhost ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)... 
Feb 20 04:42:28 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain Feb 20 04:42:28 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:28 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:28 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:29 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)... Feb 20 04:42:29 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain Feb 20 04:42:29 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:29 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:29 localhost ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)... Feb 20 04:42:29 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:29 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain Feb 20 04:42:29 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:29 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:29 localhost ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)... 
Feb 20 04:42:29 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:29 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:42:30 localhost nova_compute[281288]: 2026-02-20 09:42:30.132 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:30 localhost nova_compute[281288]: 2026-02-20 09:42:30.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:30 localhost nova_compute[281288]: 2026-02-20 09:42:30.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:42:30 localhost nova_compute[281288]: 2026-02-20 09:42:30.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:30 localhost nova_compute[281288]: 2026-02-20 09:42:30.134 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:30 localhost nova_compute[281288]: 2026-02-20 09:42:30.137 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:30 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:31 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' 
entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:31 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:31 localhost ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)... Feb 20 04:42:31 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 20 04:42:31 localhost ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain Feb 20 04:42:31 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:31 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:31 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:31 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 20 04:42:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:42:32 localhost podman[291464]: 2026-02-20 09:42:32.146520229 +0000 UTC m=+0.084937307 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:42:32 localhost podman[291464]: 2026-02-20 09:42:32.154735715 +0000 UTC m=+0.093152783 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:42:32 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:42:32 localhost ceph-mon[288586]: Saving service mon spec with placement label:mon Feb 20 04:42:32 localhost ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)... Feb 20 04:42:32 localhost ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain Feb 20 04:42:32 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:32 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:32 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:42:33 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc38000 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Feb 20 04:42:33 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:42:33 localhost ceph-mon[288586]: paxos.2).electionLogic(26) init, last seen epoch 26 Feb 20 04:42:33 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:35 localhost nova_compute[281288]: 2026-02-20 09:42:35.137 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:38 localhost ceph-mon[288586]: paxos.2).electionLogic(27) init, last seen epoch 27, mid-election, bumping Feb 20 
04:42:38 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:38 localhost ceph-mon[288586]: Remove daemons mon.np0005625202 Feb 20 04:42:38 localhost ceph-mon[288586]: Safe to remove mon.np0005625202: new quorum should be ['np0005625201', 'np0005625200', 'np0005625204', 'np0005625203'] (from ['np0005625201', 'np0005625200', 'np0005625204', 'np0005625203']) Feb 20 04:42:38 localhost ceph-mon[288586]: Removing monitor np0005625202 from monmap... Feb 20 04:42:38 localhost ceph-mon[288586]: Removing daemon mon.np0005625202 from np0005625202.localdomain -- ports [] Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625200 calling monitor election Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:42:38 localhost ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)... 
Feb 20 04:42:38 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625203 in quorum (ranks 0,1,3) Feb 20 04:42:38 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625200 calling monitor election Feb 20 04:42:38 localhost ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625204,np0005625203 in quorum (ranks 0,1,2,3) Feb 20 04:42:38 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:42:38 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:42:39 localhost systemd[1]: tmp-crun.VqS38Z.mount: Deactivated successfully. 
Feb 20 04:42:39 localhost podman[291506]: 2026-02-20 09:42:39.159171352 +0000 UTC m=+0.093909528 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:42:39 localhost podman[291506]: 2026-02-20 09:42:39.198203391 +0000 UTC m=+0.132941607 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:42:39 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:42:39 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain Feb 20 04:42:39 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:39 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:39 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:40 localhost nova_compute[281288]: 2026-02-20 09:42:40.139 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:40 localhost nova_compute[281288]: 2026-02-20 09:42:40.141 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:40 localhost nova_compute[281288]: 2026-02-20 09:42:40.141 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:42:40 localhost nova_compute[281288]: 2026-02-20 09:42:40.141 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:40 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:40 localhost nova_compute[281288]: 2026-02-20 09:42:40.177 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:40 localhost nova_compute[281288]: 2026-02-20 09:42:40.178 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:40 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625200.ypbkax (monmap changed)... Feb 20 04:42:40 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625200.ypbkax on np0005625200.localdomain Feb 20 04:42:40 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:42:41 localhost podman[291529]: 2026-02-20 09:42:41.145316932 +0000 UTC m=+0.082765763 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9/ubi-minimal, managed_by=edpm_ansible, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 
'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:42:41 localhost podman[291529]: 2026-02-20 09:42:41.165048434 +0000 UTC m=+0.102497255 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., release=1770267347, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, version=9.7, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc.) Feb 20 04:42:41 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:42:41 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:41 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)... 
Feb 20 04:42:41 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:41 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain Feb 20 04:42:41 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:42 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:42 localhost ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)... Feb 20 04:42:42 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:42 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain Feb 20 04:42:42 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:42 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:42 localhost ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)... 
Feb 20 04:42:42 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:42 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:42:44 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:44 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:44 localhost ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)... Feb 20 04:42:44 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 20 04:42:44 localhost ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain Feb 20 04:42:44 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:42:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:42:45 localhost systemd[1]: tmp-crun.mdJ80Z.mount: Deactivated successfully. 
Feb 20 04:42:45 localhost podman[291550]: 2026-02-20 09:42:45.167258292 +0000 UTC m=+0.096279077 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 20 04:42:45 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.179 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:45 localhost 
nova_compute[281288]: 2026-02-20 09:42:45.181 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.181 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.182 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.215 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.216 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:45 localhost systemd[1]: tmp-crun.Kc9oeL.mount: Deactivated successfully. 
Feb 20 04:42:45 localhost podman[291551]: 2026-02-20 09:42:45.232136877 +0000 UTC m=+0.159004328 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Feb 20 04:42:45 localhost 
podman[291551]: 2026-02-20 09:42:45.269075635 +0000 UTC m=+0.195943106 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:42:45 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:42:45 localhost podman[291550]: 2026-02-20 09:42:45.287529738 +0000 UTC m=+0.216550593 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:42:45 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:42:45 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:45 localhost ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)... 
Feb 20 04:42:45 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 20 04:42:45 localhost ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain Feb 20 04:42:45 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:45 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:45 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.756 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.757 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.757 281292 DEBUG oslo_concurrency.lockutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.757 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:42:45 localhost nova_compute[281288]: 2026-02-20 09:42:45.758 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:42:46 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:42:46 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3587761072' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.222 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.311 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.311 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.544 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.546 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11855MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.547 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.547 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.610 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.611 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.611 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:42:46 localhost nova_compute[281288]: 2026-02-20 09:42:46.651 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:42:46 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)... 
Feb 20 04:42:46 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:42:46 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:46 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:46 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:47 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:42:47 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/724040001' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:42:47 localhost nova_compute[281288]: 2026-02-20 09:42:47.098 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:42:47 localhost nova_compute[281288]: 2026-02-20 09:42:47.105 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:42:47 localhost nova_compute[281288]: 2026-02-20 09:42:47.126 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 
1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:42:47 localhost nova_compute[281288]: 2026-02-20 09:42:47.128 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:42:47 localhost nova_compute[281288]: 2026-02-20 09:42:47.129 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:42:47 localhost podman[241968]: time="2026-02-20T09:42:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:42:47 localhost podman[241968]: @ - - [20/Feb/2026:09:42:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:42:47 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... 
Feb 20 04:42:47 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:42:47 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:47 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:47 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:42:47 localhost podman[241968]: @ - - [20/Feb/2026:09:42:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18262 "" "Go-http-client/1.1" Feb 20 04:42:48 localhost nova_compute[281288]: 2026-02-20 09:42:48.129 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:48 localhost nova_compute[281288]: 2026-02-20 09:42:48.130 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:48 localhost nova_compute[281288]: 2026-02-20 09:42:48.130 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:48 localhost nova_compute[281288]: 2026-02-20 09:42:48.130 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task 
ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:48 localhost nova_compute[281288]: 2026-02-20 09:42:48.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:48 localhost nova_compute[281288]: 2026-02-20 09:42:48.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:48 localhost nova_compute[281288]: 2026-02-20 09:42:48.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:42:48 localhost ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)... 
Feb 20 04:42:48 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:42:48 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:48 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:48 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 20 04:42:49 localhost nova_compute[281288]: 2026-02-20 09:42:49.719 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:49 localhost nova_compute[281288]: 2026-02-20 09:42:49.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:42:49 localhost nova_compute[281288]: 2026-02-20 09:42:49.720 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:42:49 localhost nova_compute[281288]: 2026-02-20 09:42:49.720 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:42:49 localhost ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)... 
Feb 20 04:42:49 localhost ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:42:49 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:49 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:49 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 20 04:42:49 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:49 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:42:49 localhost nova_compute[281288]: 2026-02-20 09:42:49.970 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:42:49 localhost nova_compute[281288]: 2026-02-20 09:42:49.971 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:42:49 localhost nova_compute[281288]: 2026-02-20 09:42:49.971 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:42:49 localhost nova_compute[281288]: 2026-02-20 09:42:49.972 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 
'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:42:50 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:42:50 localhost nova_compute[281288]: 2026-02-20 09:42:50.216 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:50 localhost nova_compute[281288]: 2026-02-20 09:42:50.219 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:50 localhost nova_compute[281288]: 2026-02-20 09:42:50.293 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:42:50 localhost nova_compute[281288]: 2026-02-20 09:42:50.309 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:42:50 localhost nova_compute[281288]: 2026-02-20 09:42:50.309 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:42:50 localhost ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)... Feb 20 04:42:50 localhost ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:42:50 localhost ceph-mon[288586]: Deploying daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:42:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:50 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:42:51 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)... 
Feb 20 04:42:51 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain Feb 20 04:42:51 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:51 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:51 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:42:52 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Feb 20 04:42:52 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Feb 20 04:42:52 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Feb 20 04:42:52 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc38160 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Feb 20 04:42:52 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:42:52 localhost ceph-mon[288586]: paxos.2).electionLogic(32) init, last seen epoch 32 Feb 20 04:42:52 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:52 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:52 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:52 localhost podman[291690]: Feb 20 
04:42:52 localhost podman[291690]: 2026-02-20 09:42:52.793717759 +0000 UTC m=+0.080156535 container create 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:42:52 localhost systemd[1]: Started libpod-conmon-9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1.scope. Feb 20 04:42:52 localhost systemd[1]: Started libcrun container. 
Feb 20 04:42:52 localhost podman[291690]: 2026-02-20 09:42:52.761281456 +0000 UTC m=+0.047720232 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:52 localhost podman[291690]: 2026-02-20 09:42:52.870363537 +0000 UTC m=+0.156802313 container init 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.) 
Feb 20 04:42:52 localhost podman[291690]: 2026-02-20 09:42:52.884097928 +0000 UTC m=+0.170536684 container start 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=1770267347, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True) Feb 20 04:42:52 localhost podman[291690]: 2026-02-20 09:42:52.884306225 +0000 UTC m=+0.170745051 container attach 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured 
and supported base image., build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , ceph=True, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:42:52 localhost wonderful_bartik[291705]: 167 167 Feb 20 04:42:52 localhost podman[291690]: 2026-02-20 09:42:52.889987845 +0000 UTC m=+0.176426631 container died 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, maintainer=Guillaume Abrioux , 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public) Feb 20 04:42:52 localhost systemd[1]: libpod-9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1.scope: Deactivated successfully. Feb 20 04:42:52 localhost podman[291710]: 2026-02-20 09:42:52.983183149 +0000 UTC m=+0.081196716 container remove 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, ceph=True, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_CLEAN=True, RELEASE=main, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z) Feb 20 04:42:52 localhost systemd[1]: libpod-conmon-9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1.scope: Deactivated successfully. 
Feb 20 04:42:53 localhost systemd[1]: var-lib-containers-storage-overlay-520da4a85081cd80df64788a7f69a185268d89edac0e4e6baa5016e415001ea9-merged.mount: Deactivated successfully. Feb 20 04:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:42:55 localhost podman[291730]: 2026-02-20 09:42:55.148074059 +0000 UTC m=+0.086046231 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 20 04:42:55 localhost podman[291730]: 2026-02-20 09:42:55.185060388 +0000 UTC m=+0.123032560 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 04:42:55 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:42:55 localhost nova_compute[281288]: 2026-02-20 09:42:55.220 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:42:55 localhost nova_compute[281288]: 2026-02-20 09:42:55.222 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:55 localhost nova_compute[281288]: 2026-02-20 09:42:55.222 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:42:55 localhost nova_compute[281288]: 2026-02-20 09:42:55.222 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:55 localhost nova_compute[281288]: 2026-02-20 09:42:55.223 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:42:55 localhost nova_compute[281288]: 2026-02-20 09:42:55.225 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:42:56 localhost openstack_network_exporter[244414]: ERROR 09:42:56 appctl.go:174: 
call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:42:56 localhost openstack_network_exporter[244414]: Feb 20 04:42:56 localhost openstack_network_exporter[244414]: ERROR 09:42:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:42:56 localhost openstack_network_exporter[244414]: Feb 20 04:42:57 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:57 localhost ceph-mon[288586]: Reconfiguring crash.np0005625204 (monmap changed)... Feb 20 04:42:57 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain Feb 20 04:42:57 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:42:57 localhost ceph-mon[288586]: mon.np0005625200 calling monitor election Feb 20 04:42:57 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:42:57 localhost ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625204 in quorum (ranks 0,1,2) Feb 20 04:42:57 localhost ceph-mon[288586]: Health check failed: 2/5 mons down, quorum np0005625201,np0005625200,np0005625204 (MON_DOWN) Feb 20 04:42:57 localhost ceph-mon[288586]: Health detail: HEALTH_WARN 2/5 mons down, quorum np0005625201,np0005625200,np0005625204 Feb 20 04:42:57 localhost ceph-mon[288586]: [WRN] MON_DOWN: 2/5 mons down, quorum np0005625201,np0005625200,np0005625204 Feb 20 04:42:57 localhost ceph-mon[288586]: mon.np0005625203 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum) Feb 20 04:42:57 localhost ceph-mon[288586]: mon.np0005625202 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum) Feb 20 04:42:57 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:57 localhost ceph-mon[288586]: from='mgr.14196 
172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:57 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 20 04:42:58 localhost podman[291801]: Feb 20 04:42:58 localhost podman[291801]: 2026-02-20 09:42:58.204458779 +0000 UTC m=+0.078249848 container create fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, version=7, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:42:58 localhost systemd[1]: Started libpod-conmon-fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a.scope. Feb 20 04:42:58 localhost systemd[1]: Started libcrun container. 
Feb 20 04:42:58 localhost podman[291801]: 2026-02-20 09:42:58.174285614 +0000 UTC m=+0.048076683 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:58 localhost podman[291801]: 2026-02-20 09:42:58.282980343 +0000 UTC m=+0.156771412 container init fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, maintainer=Guillaume Abrioux , RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:42:58 localhost podman[291801]: 2026-02-20 09:42:58.292247381 +0000 UTC m=+0.166038410 container start fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, architecture=x86_64, version=7, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, release=1770267347, GIT_BRANCH=main, io.buildah.version=1.42.2) Feb 20 04:42:58 localhost podman[291801]: 2026-02-20 09:42:58.292423996 +0000 UTC m=+0.166215055 container attach fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red 
Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347) Feb 20 04:42:58 localhost nostalgic_blackburn[291816]: 167 167 Feb 20 04:42:58 localhost systemd[1]: libpod-fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a.scope: Deactivated successfully. Feb 20 04:42:58 localhost podman[291801]: 2026-02-20 09:42:58.299143938 +0000 UTC m=+0.172935017 container died fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, release=1770267347, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux ) Feb 20 
04:42:58 localhost podman[291821]: 2026-02-20 09:42:58.396808946 +0000 UTC m=+0.086924527 container remove fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, RELEASE=main, build-date=2026-02-09T10:25:24Z, ceph=True, com.redhat.component=rhceph-container) Feb 20 04:42:58 localhost systemd[1]: libpod-conmon-fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a.scope: Deactivated successfully. 
Feb 20 04:42:58 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:42:58 localhost ceph-mon[288586]: paxos.2).electionLogic(35) init, last seen epoch 35, mid-election, bumping Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625200 calling monitor election Feb 20 04:42:58 localhost ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625204,np0005625203,np0005625202 in quorum (ranks 0,1,2,3,4) Feb 20 04:42:58 localhost ceph-mon[288586]: Health check cleared: MON_DOWN (was: 2/5 mons down, quorum np0005625201,np0005625200,np0005625204) Feb 20 04:42:58 localhost ceph-mon[288586]: Cluster is now healthy Feb 20 04:42:58 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:42:58 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:59 localhost systemd[1]: var-lib-containers-storage-overlay-d772c0e9da178ffb07cddfe2e1af16f00670a6862566c26671fc13066a1c6e9b-merged.mount: Deactivated successfully. 
Feb 20 04:42:59 localhost podman[291897]: Feb 20 04:42:59 localhost podman[291897]: 2026-02-20 09:42:59.226312397 +0000 UTC m=+0.068702921 container create c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:42:59 localhost systemd[1]: Started libpod-conmon-c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf.scope. Feb 20 04:42:59 localhost systemd[1]: Started libcrun container. 
Feb 20 04:42:59 localhost podman[291897]: 2026-02-20 09:42:59.28946598 +0000 UTC m=+0.131856494 container init c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, ceph=True, vendor=Red Hat, Inc., version=7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main) Feb 20 04:42:59 localhost podman[291897]: 2026-02-20 09:42:59.198868984 +0000 UTC m=+0.041259488 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:42:59 localhost adoring_blackwell[291912]: 167 167 Feb 20 04:42:59 localhost systemd[1]: libpod-c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf.scope: Deactivated successfully. 
Feb 20 04:42:59 localhost podman[291897]: 2026-02-20 09:42:59.303744669 +0000 UTC m=+0.146135193 container start c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:42:59 localhost podman[291897]: 2026-02-20 09:42:59.303982527 +0000 UTC m=+0.146373081 container attach c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:42:59 localhost podman[291897]: 2026-02-20 09:42:59.306230763 +0000 UTC m=+0.148621337 container died c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, release=1770267347, maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
distribution-scope=public, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64) Feb 20 04:42:59 localhost podman[291917]: 2026-02-20 09:42:59.389256193 +0000 UTC m=+0.074717272 container remove c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.buildah.version=1.42.2, release=1770267347, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, name=rhceph, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7) Feb 20 04:42:59 localhost systemd[1]: libpod-conmon-c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf.scope: Deactivated successfully. Feb 20 04:42:59 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:59 localhost ceph-mon[288586]: Reconfiguring osd.3 (monmap changed)... 
Feb 20 04:42:59 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 20 04:42:59 localhost ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain Feb 20 04:42:59 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:59 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:42:59 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:43:00 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:00 localhost systemd[1]: tmp-crun.CXnr9P.mount: Deactivated successfully. Feb 20 04:43:00 localhost systemd[1]: var-lib-containers-storage-overlay-3888d7f4117587f36bc7a7537526994b01e2d40b6eca482e4f33aa91a3c8ec33-merged.mount: Deactivated successfully. 
Feb 20 04:43:00 localhost podman[291994]: Feb 20 04:43:00 localhost nova_compute[281288]: 2026-02-20 09:43:00.227 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:00 localhost nova_compute[281288]: 2026-02-20 09:43:00.230 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:00 localhost nova_compute[281288]: 2026-02-20 09:43:00.230 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:43:00 localhost nova_compute[281288]: 2026-02-20 09:43:00.230 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:00 localhost podman[291994]: 2026-02-20 09:43:00.240275779 +0000 UTC m=+0.085944738 container create a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, version=7, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container) Feb 20 04:43:00 localhost nova_compute[281288]: 2026-02-20 09:43:00.261 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:00 localhost nova_compute[281288]: 2026-02-20 09:43:00.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:00 localhost systemd[1]: Started libpod-conmon-a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936.scope. Feb 20 04:43:00 localhost systemd[1]: Started libcrun container. 
Feb 20 04:43:00 localhost podman[291994]: 2026-02-20 09:43:00.200000421 +0000 UTC m=+0.045669410 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:43:00 localhost podman[291994]: 2026-02-20 09:43:00.312707081 +0000 UTC m=+0.158376040 container init a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, build-date=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:43:00 localhost podman[291994]: 2026-02-20 09:43:00.323045571 +0000 UTC m=+0.168714530 container start a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red 
Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.buildah.version=1.42.2, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:43:00 localhost lucid_hopper[292009]: 167 167 Feb 20 04:43:00 localhost podman[291994]: 2026-02-20 09:43:00.323313958 +0000 UTC m=+0.168982917 container attach a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-type=git, io.buildah.version=1.42.2, io.openshift.tags=rhceph 
ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph) Feb 20 04:43:00 localhost systemd[1]: libpod-a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936.scope: Deactivated successfully. Feb 20 04:43:00 localhost podman[291994]: 2026-02-20 09:43:00.328965598 +0000 UTC m=+0.174634617 container died a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhceph, ceph=True, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vendor=Red Hat, Inc., release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main) Feb 20 04:43:00 localhost podman[292014]: 2026-02-20 09:43:00.415796321 +0000 
UTC m=+0.077315798 container remove a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:43:00 localhost systemd[1]: libpod-conmon-a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936.scope: Deactivated successfully. Feb 20 04:43:00 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)... 
Feb 20 04:43:00 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain Feb 20 04:43:00 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:00 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:00 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:43:01 localhost podman[292082]: Feb 20 04:43:01 localhost podman[292082]: 2026-02-20 09:43:01.103665726 +0000 UTC m=+0.074450063 container create 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, version=7, RELEASE=main, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1770267347) Feb 20 04:43:01 localhost systemd[1]: Started libpod-conmon-802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2.scope. Feb 20 04:43:01 localhost systemd[1]: Started libcrun container. Feb 20 04:43:01 localhost podman[292082]: 2026-02-20 09:43:01.167584603 +0000 UTC m=+0.138368890 container init 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , release=1770267347, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public) Feb 20 04:43:01 localhost podman[292082]: 2026-02-20 09:43:01.07309878 +0000 UTC m=+0.043883097 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:43:01 localhost podman[292082]: 2026-02-20 09:43:01.177715866 +0000 UTC m=+0.148500153 container start 
802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_CLEAN=True) Feb 20 04:43:01 localhost podman[292082]: 2026-02-20 09:43:01.178217352 +0000 UTC m=+0.149001669 container attach 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, version=7, RELEASE=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-type=git, 
url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z) Feb 20 04:43:01 localhost brave_clarke[292097]: 167 167 Feb 20 04:43:01 localhost systemd[1]: libpod-802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2.scope: Deactivated successfully. Feb 20 04:43:01 localhost podman[292082]: 2026-02-20 09:43:01.181365346 +0000 UTC m=+0.152149663 container died 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, ceph=True, version=7, GIT_CLEAN=True, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux , vcs-type=git, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.component=rhceph-container) Feb 20 04:43:01 localhost systemd[1]: var-lib-containers-storage-overlay-3c4e09e03ff688918d040ca7f542af7a49eeb95eb68544f947d554e00306a41c-merged.mount: Deactivated successfully. Feb 20 04:43:01 localhost systemd[1]: tmp-crun.dvpS09.mount: Deactivated successfully. Feb 20 04:43:01 localhost systemd[1]: var-lib-containers-storage-overlay-f748b64a3207ee2b2c1e27a7d254cc878c50407a5d8a5ee1c916bad482448952-merged.mount: Deactivated successfully. Feb 20 04:43:01 localhost podman[292102]: 2026-02-20 09:43:01.295050985 +0000 UTC m=+0.105101952 container remove 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, vcs-type=git, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, summary=Provides the latest 
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7) Feb 20 04:43:01 localhost systemd[1]: libpod-conmon-802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2.scope: Deactivated successfully. Feb 20 04:43:01 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)... Feb 20 04:43:01 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain Feb 20 04:43:01 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:01 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:43:03 localhost podman[292186]: 2026-02-20 09:43:03.151405564 +0000 UTC m=+0.082819574 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible) Feb 20 04:43:03 localhost podman[292186]: 2026-02-20 09:43:03.162393064 +0000 UTC m=+0.093807094 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:43:03 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:43:04 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:04 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:04 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:43:05 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:05 localhost nova_compute[281288]: 2026-02-20 09:43:05.263 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:05 localhost nova_compute[281288]: 2026-02-20 09:43:05.265 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:05 localhost nova_compute[281288]: 2026-02-20 09:43:05.265 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:43:05 localhost nova_compute[281288]: 2026-02-20 09:43:05.265 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:05 localhost nova_compute[281288]: 2026-02-20 09:43:05.266 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:05 localhost nova_compute[281288]: 2026-02-20 09:43:05.268 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:43:06.006 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:43:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:43:06.006 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:43:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:43:06.007 162652 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: Reconfig service osd.default_drive_group Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 
localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:06 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:43:07 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 e86: 6 total, 6 up, 6 in Feb 20 04:43:07 localhost systemd[1]: session-65.scope: Deactivated successfully. Feb 20 04:43:07 localhost systemd[1]: session-65.scope: Consumed 18.516s CPU time. Feb 20 04:43:07 localhost systemd-logind[759]: Session 65 logged out. Waiting for processes to exit. Feb 20 04:43:07 localhost systemd-logind[759]: Removed session 65. Feb 20 04:43:07 localhost ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)... 
Feb 20 04:43:07 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain Feb 20 04:43:07 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:07 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' Feb 20 04:43:07 localhost ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:43:07 localhost ceph-mon[288586]: from='client.? 172.18.0.200:0/863103056' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 20 04:43:07 localhost ceph-mon[288586]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 20 04:43:07 localhost ceph-mon[288586]: Activating manager daemon np0005625199.ileebh Feb 20 04:43:07 localhost ceph-mon[288586]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 20 04:43:07 localhost sshd[292547]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:43:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:43:10 localhost podman[292548]: 2026-02-20 09:43:10.144369226 +0000 UTC m=+0.081594368 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:43:10 localhost podman[292548]: 2026-02-20 09:43:10.157114808 +0000 UTC m=+0.094339940 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:43:10 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:43:10 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:10 localhost nova_compute[281288]: 2026-02-20 09:43:10.298 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:10 localhost nova_compute[281288]: 2026-02-20 09:43:10.300 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:10 localhost nova_compute[281288]: 2026-02-20 09:43:10.301 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:43:10 localhost nova_compute[281288]: 2026-02-20 09:43:10.301 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:10 localhost nova_compute[281288]: 2026-02-20 09:43:10.302 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:10 localhost nova_compute[281288]: 2026-02-20 09:43:10.305 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:43:12 localhost systemd[1]: tmp-crun.SJzBBv.mount: Deactivated successfully. 
Feb 20 04:43:12 localhost podman[292570]: 2026-02-20 09:43:12.144286511 +0000 UTC m=+0.084015270 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, config_id=openstack_network_exporter, release=1770267347, 
vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64) Feb 20 04:43:12 localhost podman[292570]: 2026-02-20 09:43:12.162061153 +0000 UTC m=+0.101789962 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1770267347, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Feb 20 04:43:12 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:43:15 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:15 localhost nova_compute[281288]: 2026-02-20 09:43:15.303 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:43:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:43:16 localhost systemd[1]: tmp-crun.JQQxd9.mount: Deactivated successfully. Feb 20 04:43:16 localhost podman[292591]: 2026-02-20 09:43:16.152875551 +0000 UTC m=+0.091013380 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Feb 20 04:43:16 localhost podman[292591]: 2026-02-20 09:43:16.185321744 +0000 UTC m=+0.123459553 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Feb 20 04:43:16 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:43:16 localhost podman[292590]: 2026-02-20 09:43:16.208486968 +0000 UTC m=+0.148428401 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:43:16 localhost podman[292590]: 2026-02-20 09:43:16.3012843 +0000 UTC m=+0.241225763 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:43:16 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:43:17 localhost systemd[1]: Stopping User Manager for UID 1002... Feb 20 04:43:17 localhost systemd[26592]: Activating special unit Exit the Session... Feb 20 04:43:17 localhost systemd[26592]: Removed slice User Background Tasks Slice. Feb 20 04:43:17 localhost systemd[26592]: Stopped target Main User Target. Feb 20 04:43:17 localhost systemd[26592]: Stopped target Basic System. Feb 20 04:43:17 localhost systemd[26592]: Stopped target Paths. Feb 20 04:43:17 localhost systemd[26592]: Stopped target Sockets. Feb 20 04:43:17 localhost systemd[26592]: Stopped target Timers. Feb 20 04:43:17 localhost systemd[26592]: Stopped Mark boot as successful after the user session has run 2 minutes. Feb 20 04:43:17 localhost systemd[26592]: Stopped Daily Cleanup of User's Temporary Directories. Feb 20 04:43:17 localhost systemd[26592]: Closed D-Bus User Message Bus Socket. Feb 20 04:43:17 localhost systemd[26592]: Stopped Create User's Volatile Files and Directories. Feb 20 04:43:17 localhost systemd[26592]: Removed slice User Application Slice. Feb 20 04:43:17 localhost systemd[26592]: Reached target Shutdown. Feb 20 04:43:17 localhost systemd[26592]: Finished Exit the Session. Feb 20 04:43:17 localhost systemd[26592]: Reached target Exit the Session. Feb 20 04:43:17 localhost systemd[1]: user@1002.service: Deactivated successfully. Feb 20 04:43:17 localhost systemd[1]: Stopped User Manager for UID 1002. Feb 20 04:43:17 localhost systemd[1]: user@1002.service: Consumed 11.619s CPU time, read 0B from disk, written 7.0K to disk. Feb 20 04:43:17 localhost systemd[1]: Stopping User Runtime Directory /run/user/1002... Feb 20 04:43:17 localhost systemd[1]: run-user-1002.mount: Deactivated successfully. 
Feb 20 04:43:17 localhost systemd[1]: user-runtime-dir@1002.service: Deactivated successfully. Feb 20 04:43:17 localhost systemd[1]: Stopped User Runtime Directory /run/user/1002. Feb 20 04:43:17 localhost systemd[1]: Removed slice User Slice of UID 1002. Feb 20 04:43:17 localhost systemd[1]: user-1002.slice: Consumed 3min 57.069s CPU time. Feb 20 04:43:17 localhost podman[241968]: time="2026-02-20T09:43:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:43:17 localhost podman[241968]: @ - - [20/Feb/2026:09:43:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:43:17 localhost podman[241968]: @ - - [20/Feb/2026:09:43:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18254 "" "Go-http-client/1.1" Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.206 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 
04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.213 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'baed3cf0-0519-4d64-b0aa-7fe254dec918', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.208001', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '95faf6d2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 
'message_signature': '9bc7dffff159f89e07b21258329444f84059afe8234c9e7f7e79ab50c09c1236'}]}, 'timestamp': '2026-02-20 09:43:18.214749', '_unique_id': '7b7f6041f4b44af5a89378186d8a5749'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.217 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '531b533e-887b-474f-a6f2-a92b598cbaa3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.217694', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '95fb814c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': 'daf4cce22be5cb3551d73e43beded9fbec350b1c232e1d21ea0428ae404830fe'}]}, 'timestamp': '2026-02-20 09:43:18.218207', '_unique_id': 'ab4d8c58e04d4103a12e198e2d1aea44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.250 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.251 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cfd1ca9-402e-4fe0-8855-0c6857f43787', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.220349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96008872-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '6e66dce6ac5ec77b182591fce5909c77df4bef9a8ba8f0a8c9d28ed1c6202527'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.220349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9600992a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '2599f8b3db8133089deab6e687f6ac13104386e4edf293ff42dbaddb664ed1d0'}]}, 'timestamp': '2026-02-20 09:43:18.251553', '_unique_id': 'ccc99a28f5d64d8197269646b4594167'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.264 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.265 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e6f2ab90-5687-49e3-ad36-4b550e174853', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.253820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9602b6f6-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '4c3cb180e0ed8d2f813d1ca4227f42b75a11f869380d8d3bf49bb960d942bd6a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.253820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9602c8d0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': 'e37565ac37d0cfe0b88f66d8fb08899ddb4bb155f3d9a359235818a5331397f0'}]}, 'timestamp': '2026-02-20 09:43:18.265922', '_unique_id': '64266122b76948b2bb269fd430622654'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:43:18.266 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:43:18.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.267 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.268 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0af25826-ac38-4447-ba89-ed9cbffb4c1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.268120', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '960331da-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '0fb4daf7647d6c53e0ccc3a095f2d43bf1c5f674ae1ba8f1206fc0a73608f363'}]}, 'timestamp': '2026-02-20 09:43:18.268595', '_unique_id': '55513b72ba8a4e4db4da9f63ba513803'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.270 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.285 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 13990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8ff7006-6faa-4148-a2d5-8cd224e07557', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13990000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:43:18.270759', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9605c6a2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.524129167, 
'message_signature': 'f29f28b657a17f4b9c0e4a7b4c72a4555eb2cd02d32d2248dd95d034fb0811dd'}]}, 'timestamp': '2026-02-20 09:43:18.285500', '_unique_id': '15fe47e4c8434cd08383dd042cbe2710'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.287 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.287 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.288 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bfe34221-283f-40e8-8573-e089067c8acc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.287994', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '960639f2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': 'c30f8ec4dcba8988f68786e481ae67826ca2ef12b021ea6a08391ed2abcc3921'}]}, 'timestamp': '2026-02-20 09:43:18.288464', '_unique_id': '949625cbd2b2428683fd15605ae9d63c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b59db32-4a96-450a-8926-02cfb7d2ac57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.290578', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '96069ff0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '3f3a59cf4275d2204616a86eff7edfde05a9df9d0e0e6138f9a261a315628a33'}]}, 'timestamp': '2026-02-20 09:43:18.291077', '_unique_id': '411f0307068e4ca59c639b3ac21d483b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 
04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8574e3e8-5f11-412f-9017-612e9d0643df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.293404', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '96070ce2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': 'd661d49f779d9581bad5ff0086481ebc3f757406f3db35560a56ac1a2bf4a20a'}]}, 'timestamp': '2026-02-20 09:43:18.293919', '_unique_id': '320e8414064f47bfa7d9079b7a203325'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.295 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e6dd7e7-a2b5-4e27-bbd8-96dd0898b602', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.295992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '960771c8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '310202cb0cc557f12deab0d6169c439c2d5222de2a9636e60647bc45eae09a66'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.295992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '960781c2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '1fd1b530f7d3431f91290ca64ec1fddabcf8e6b1d8353fcb78802673e18638ec'}]}, 'timestamp': '2026-02-20 09:43:18.296857', '_unique_id': '83994accae994c69b7eca1d217b2a876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fecb5b6-97d1-4f6d-93dd-2c96588c139b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.298995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9607e720-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': 'a62cc5dcbc2a430e4a2e8ddb9af8115102175a51aeed1a0c191a8752e7aa914e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.298995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9607f710-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '62f0a3f86e34d365aa75455558dea6a38a328159390bd57d019d660142f0f4ac'}]}, 'timestamp': '2026-02-20 09:43:18.299857', '_unique_id': 'ff6324fa509f416d9b08851ac4d387ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c64ace2b-fd18-4967-99f5-4a91c1227c80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.302043', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '96085e62-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '2e751a97eda1c247112db5464b5520ae432b8006cbb313ca156f3738a7bff5c6'}]}, 'timestamp': '2026-02-20 09:43:18.302502', '_unique_id': '5df6b4c8d7f549d68cc8ee6bcfa31ad1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.305 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.305 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '20eda941-73b1-4129-a662-576a87825324', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.305030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9608d4f0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': 'da9cdc032db6075d704c3e33a9de75ef62d5aba9aeb6a7a983fe00e71b8070ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.305030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9608e6f2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '30b0c4fcd9019ee7962ac6141d4a79d7545603d6e58ad7a9a41469c214678b36'}]}, 'timestamp': '2026-02-20 09:43:18.306001', '_unique_id': '7779f90e5acb42e5bef2b74f3fe96385'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.308 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26a5b4b9-e545-4c73-bc7b-3ec82b08f296', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.308132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96094bec-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '18ca41596036b9fc7772f00f7a8bbfab56f3204611eb65936daa1cb01c398aba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.308132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96095d26-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '8ff85905451a98fdc361163ad5151d76596e58f213949234e7eec928358f4c89'}]}, 'timestamp': '2026-02-20 09:43:18.308997', '_unique_id': 'ff66cdabedb54dfda9d7840d7fd560a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.310 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e8223d4-b32d-4607-954d-e9ef49338a44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.311104', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '9609c054-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '9a30cd04ba3588994ec10fd8eb2d747bdcf7e342901af91502bca811fb2c8030'}]}, 'timestamp': '2026-02-20 09:43:18.311559', '_unique_id': 'd0b4c744fd954bb782fb186fffc45728'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba3964db-bc0c-49c6-a858-a43dce90c384', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.313593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '960a227e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '6cc696e2a4b96d668cf28c3d9a47f4f88298a0e322bb0212c1ccb858fbbe6b06'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.313593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '960a3386-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '4f16917163a53b1520a0476f0a8e0789bc1fc0ec8acdd1d0e9ebb6ab1bf4e200'}]}, 'timestamp': '2026-02-20 09:43:18.314482', '_unique_id': 'e0d02c335ecd4f8b8e1706310121bc98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ae96e96-7c1c-4b2e-a048-342642bfc82e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.316773', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '960a9dee-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '0c478b48081fcd1eb084878c10b4cbb9d3e4d851b0b06132809ada16f8be3720'}]}, 'timestamp': '2026-02-20 09:43:18.317237', '_unique_id': '2461ce13138347acaf519a75b6097833'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e7d38276-7115-489b-a2f9-5dfc62c1d94e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:43:18.319397', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '960b077a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.524129167, 'message_signature': '74afc0aee02c01a1c5ba85c4166cfcc25472d5573be97e2b805ea44d4d090277'}]}, 'timestamp': '2026-02-20 09:43:18.319923', '_unique_id': 'acb40f8f85d94391958ccf35899231eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a657617e-073b-4b07-be7a-34ee5a4b744b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.322041', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '960b6bc0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '0a5557cc5d40fd7e6919fb30581323acbfb3b3d224bb4199c8fcb994b6f18613'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.322041', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '960b7d7c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '62bc397318ef4fb14721e959c46d463a7080214db6c4f7bfad22508bbd629a64'}]}, 'timestamp': '2026-02-20 09:43:18.322951', '_unique_id': 'f0d42480dbe14d7783d5b200e22c94b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5994a3db-31d9-4eaa-b973-0287c1184aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.325529', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '960bf7c0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': 'dff897eee1be37e94535e593ac5bed20306a04c1464cfd6fefbac5b2ca4889cb'}]}, 'timestamp': '2026-02-20 09:43:18.326200', '_unique_id': '87168adcec314bd3a4e2a9ddbe02ccc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24e05402-250e-4aba-a7ce-e70af4e8f413', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.327818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '960c49fa-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': 'bb6f72b465852fac262dfc752f74f0c8aede9667b90ca9864b5966095ec0da43'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.327818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '960c547c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': 'daef8fcd6fdbad86cfdf17002cda0bbc08a011ffd85c47f5ab209893764ffc9a'}]}, 'timestamp': '2026-02-20 09:43:18.328360', '_unique_id': '75afe92aca104fb0b39a9cba55eec963'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:43:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:43:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:43:20 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:20 localhost nova_compute[281288]: 2026-02-20 09:43:20.306 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:21 localhost sshd[292634]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:43:24 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. 
Feb 20 04:43:24 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:24.961275) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:43:24 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Feb 20 04:43:24 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580604961362, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 13927, "num_deletes": 767, "total_data_size": 23964475, "memory_usage": 24959952, "flush_reason": "Manual Compaction"} Feb 20 04:43:24 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605029379, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 15075047, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 13932, "table_properties": {"data_size": 15012630, "index_size": 34791, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26565, "raw_key_size": 275335, "raw_average_key_size": 25, "raw_value_size": 14829963, "raw_average_value_size": 1396, "num_data_blocks": 1344, "num_entries": 10618, "num_filter_entries": 10618, "num_deletions": 765, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 1771580480, "file_creation_time": 1771580604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 68153 microseconds, and 32482 cpu microseconds. Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.029433) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 15075047 bytes OK Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.029455) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.031515) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.031529) EVENT_LOG_v1 {"time_micros": 1771580605031525, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.031544) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 23876971, prev total WAL file size 23876971, number of live WAL files 2. 
Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.035420) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end) Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(14MB) 8(1887B)] Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605035529, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 15076934, "oldest_snapshot_seqno": -1} Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9856 keys, 15063228 bytes, temperature: kUnknown Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605107313, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 15063228, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15002809, "index_size": 34718, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24645, "raw_key_size": 262283, "raw_average_key_size": 26, "raw_value_size": 14830021, "raw_average_value_size": 1504, "num_data_blocks": 1342, 
"num_entries": 9856, "num_filter_entries": 9856, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580605, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.107972) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 15063228 bytes Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.109902) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 209.4 rd, 209.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(14.4, 0.0 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10623, records dropped: 767 output_compression: NoCompression Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.109941) EVENT_LOG_v1 {"time_micros": 1771580605109922, "job": 4, "event": "compaction_finished", "compaction_time_micros": 72012, "compaction_time_cpu_micros": 40311, "output_level": 6, "num_output_files": 1, "total_output_size": 15063228, "num_input_records": 10623, "num_output_records": 9856, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605113205, "job": 4, "event": "table_file_deletion", "file_number": 14} Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605113308, "job": 4, 
"event": "table_file_deletion", "file_number": 8} Feb 20 04:43:25 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.035286) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:43:25 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:25 localhost nova_compute[281288]: 2026-02-20 09:43:25.308 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:25 localhost nova_compute[281288]: 2026-02-20 09:43:25.310 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:25 localhost nova_compute[281288]: 2026-02-20 09:43:25.311 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:43:25 localhost nova_compute[281288]: 2026-02-20 09:43:25.311 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:25 localhost nova_compute[281288]: 2026-02-20 09:43:25.353 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:25 localhost nova_compute[281288]: 2026-02-20 09:43:25.353 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:43:26 localhost podman[292636]: 2026-02-20 09:43:26.142168883 +0000 UTC m=+0.081155105 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team) Feb 20 04:43:26 localhost podman[292636]: 2026-02-20 09:43:26.158519493 +0000 UTC m=+0.097505755 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:43:26 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:43:26 localhost openstack_network_exporter[244414]: ERROR 09:43:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:43:26 localhost openstack_network_exporter[244414]: Feb 20 04:43:26 localhost openstack_network_exporter[244414]: ERROR 09:43:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:43:26 localhost openstack_network_exporter[244414]: Feb 20 04:43:30 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:30 localhost nova_compute[281288]: 2026-02-20 09:43:30.355 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:30 localhost nova_compute[281288]: 2026-02-20 09:43:30.357 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:30 localhost nova_compute[281288]: 2026-02-20 09:43:30.357 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:43:30 localhost nova_compute[281288]: 2026-02-20 09:43:30.357 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:30 localhost nova_compute[281288]: 2026-02-20 09:43:30.358 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:30 localhost nova_compute[281288]: 
2026-02-20 09:43:30.361 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:43:34 localhost podman[292656]: 2026-02-20 09:43:34.159963179 +0000 UTC m=+0.087057661 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:43:34 localhost podman[292656]: 2026-02-20 09:43:34.171100423 +0000 UTC m=+0.098194915 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck 
podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:43:34 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:43:35 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:35 localhost nova_compute[281288]: 2026-02-20 09:43:35.360 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:39 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e87 e87: 6 total, 6 up, 6 in Feb 20 04:43:40 localhost ceph-mon[288586]: Activating manager daemon np0005625200.ypbkax Feb 20 04:43:40 localhost ceph-mon[288586]: Manager daemon np0005625199.ileebh is unresponsive, replacing it with standby daemon np0005625200.ypbkax Feb 20 04:43:40 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:40 localhost nova_compute[281288]: 2026-02-20 09:43:40.363 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:40 localhost nova_compute[281288]: 2026-02-20 09:43:40.365 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:40 localhost nova_compute[281288]: 2026-02-20 09:43:40.366 
281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:43:40 localhost nova_compute[281288]: 2026-02-20 09:43:40.366 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:40 localhost nova_compute[281288]: 2026-02-20 09:43:40.399 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:40 localhost nova_compute[281288]: 2026-02-20 09:43:40.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:40 localhost sshd[292679]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:43:40 localhost systemd[1]: Created slice User Slice of UID 1002. Feb 20 04:43:40 localhost systemd[1]: Starting User Runtime Directory /run/user/1002... Feb 20 04:43:40 localhost systemd-logind[759]: New session 66 of user ceph-admin. Feb 20 04:43:40 localhost systemd[1]: Finished User Runtime Directory /run/user/1002. Feb 20 04:43:40 localhost systemd[1]: Starting User Manager for UID 1002... 
Feb 20 04:43:40 localhost podman[292681]: 2026-02-20 09:43:40.599424195 +0000 UTC m=+0.100422892 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:43:40 localhost podman[292681]: 2026-02-20 09:43:40.61428363 +0000 UTC m=+0.115282367 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:43:40 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:43:40 localhost systemd[292696]: Queued start job for default target Main User Target. Feb 20 04:43:40 localhost systemd[292696]: Created slice User Application Slice. Feb 20 04:43:40 localhost systemd[292696]: Started Mark boot as successful after the user session has run 2 minutes. Feb 20 04:43:40 localhost systemd[292696]: Started Daily Cleanup of User's Temporary Directories. Feb 20 04:43:40 localhost systemd[292696]: Reached target Paths. Feb 20 04:43:40 localhost systemd[292696]: Reached target Timers. Feb 20 04:43:40 localhost systemd[292696]: Starting D-Bus User Message Bus Socket... Feb 20 04:43:40 localhost systemd[292696]: Starting Create User's Volatile Files and Directories... 
Feb 20 04:43:40 localhost systemd[292696]: Listening on D-Bus User Message Bus Socket.
Feb 20 04:43:40 localhost systemd[292696]: Reached target Sockets.
Feb 20 04:43:40 localhost systemd[292696]: Finished Create User's Volatile Files and Directories.
Feb 20 04:43:40 localhost systemd[292696]: Reached target Basic System.
Feb 20 04:43:40 localhost systemd[292696]: Reached target Main User Target.
Feb 20 04:43:40 localhost systemd[292696]: Startup finished in 144ms.
Feb 20 04:43:40 localhost systemd[1]: Started User Manager for UID 1002.
Feb 20 04:43:40 localhost systemd[1]: Started Session 66 of User ceph-admin.
Feb 20 04:43:41 localhost ceph-mon[288586]: Manager daemon np0005625200.ypbkax is now available
Feb 20 04:43:41 localhost ceph-mon[288586]: removing stray HostCache host record np0005625199.localdomain.devices.0
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"} : dispatch
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"} : dispatch
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"}]': finished
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"} : dispatch
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"} : dispatch
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"}]': finished
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625200.ypbkax/mirror_snapshot_schedule"} : dispatch
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625200.ypbkax/mirror_snapshot_schedule"} : dispatch
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625200.ypbkax/trash_purge_schedule"} : dispatch
Feb 20 04:43:41 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625200.ypbkax/trash_purge_schedule"} : dispatch
Feb 20 04:43:42 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:42 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:42 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:42 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:42 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:42 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:42 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:42 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:42 localhost ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Bus STARTING
Feb 20 04:43:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 04:43:42 localhost podman[292857]: 2026-02-20 09:43:42.556729291 +0000 UTC m=+0.085556726 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, version=9.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 20 04:43:42 localhost podman[292857]: 2026-02-20 09:43:42.641145742 +0000 UTC m=+0.169973147 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, managed_by=edpm_ansible, version=9.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Feb 20 04:43:42 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 04:43:42 localhost systemd[1]: tmp-crun.4x2imb.mount: Deactivated successfully.
Feb 20 04:43:42 localhost podman[292906]: 2026-02-20 09:43:42.786547472 +0000 UTC m=+0.100934488 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 04:43:42 localhost podman[292906]: 2026-02-20 09:43:42.898061425 +0000 UTC m=+0.212448511 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 04:43:43 localhost ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Serving on https://172.18.0.104:7150
Feb 20 04:43:43 localhost ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Client ('172.18.0.104', 40458) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 04:43:43 localhost ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Serving on http://172.18.0.104:8765
Feb 20 04:43:43 localhost ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Bus STARTED
Feb 20 04:43:43 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:43 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:43 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:43 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:43 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:43 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:43 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:43 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:44 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:44 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:44 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:44 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:44 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:44 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:44 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:44 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:45 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.441 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.443 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.445 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.449 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:43:45 localhost ceph-mon[288586]: Saving service mon spec with placement label:mon
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 04:43:45 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax'
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 04:43:45 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 20 04:43:45 localhost nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 20 04:43:46 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 04:43:46 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/117158667' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.210 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.269 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.270 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 20 04:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:43:46 localhost podman[293456]: 2026-02-20 09:43:46.421746207 +0000 UTC m=+0.113535835 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 04:43:46 localhost podman[293456]: 2026-02-20 09:43:46.463095407 +0000 UTC m=+0.154885035 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 04:43:46 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.538 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.540 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11867MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.540 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.542 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 04:43:46 localhost podman[293493]: 2026-02-20 09:43:46.558790186 +0000 UTC m=+0.132244106 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 04:43:46 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 04:43:46 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:43:46 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 04:43:46 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:43:46 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf
Feb 20 04:43:46 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 04:43:46 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 04:43:46 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 04:43:46 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 04:43:46 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:43:46 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:43:46 localhost podman[293493]: 2026-02-20 09:43:46.637991021 +0000 UTC m=+0.211444951 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 20 04:43:46
localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.657 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.657 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:43:46 localhost nova_compute[281288]: 2026-02-20 09:43:46.707 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:43:47 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:43:47 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1470667621' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:43:47 localhost nova_compute[281288]: 2026-02-20 09:43:47.189 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:43:47 localhost nova_compute[281288]: 2026-02-20 09:43:47.197 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:43:47 localhost nova_compute[281288]: 2026-02-20 09:43:47.220 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:43:47 localhost nova_compute[281288]: 2026-02-20 09:43:47.223 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:43:47 localhost nova_compute[281288]: 2026-02-20 09:43:47.224 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:43:47 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:43:47 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:43:47 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:43:47 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:43:47 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:43:47 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:43:47 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:43:47 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:43:47 localhost podman[241968]: time="2026-02-20T09:43:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:43:47 localhost podman[241968]: @ - - [20/Feb/2026:09:43:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:43:47 localhost podman[241968]: @ - - [20/Feb/2026:09:43:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18258 "" "Go-http-client/1.1" Feb 20 04:43:48 localhost nova_compute[281288]: 2026-02-20 09:43:48.226 281292 DEBUG oslo_service.periodic_task [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:48 localhost nova_compute[281288]: 2026-02-20 09:43:48.226 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:48 localhost nova_compute[281288]: 2026-02-20 09:43:48.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:48 localhost nova_compute[281288]: 2026-02-20 09:43:48.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:48 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:43:48 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:43:48 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:43:48 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:43:48 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring 
Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:48 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:43:49 localhost nova_compute[281288]: 2026-02-20 09:43:49.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:49 localhost nova_compute[281288]: 2026-02-20 09:43:49.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:43:49 localhost nova_compute[281288]: 2026-02-20 
09:43:49.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:43:49 localhost ceph-mon[288586]: Reconfiguring mon.np0005625200 (monmap changed)... Feb 20 04:43:49 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625200 on np0005625200.localdomain Feb 20 04:43:49 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:49 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:49 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:43:49 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.028 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.028 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.029 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: 
f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.029 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:43:50 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.433 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.445 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.449 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.452 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.452 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.453 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.454 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:50 localhost nova_compute[281288]: 2026-02-20 09:43:50.454 281292 
DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:43:50 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625200.ypbkax (monmap changed)... Feb 20 04:43:50 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625200.ypbkax on np0005625200.localdomain Feb 20 04:43:50 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:50 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:50 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:50 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:43:51 localhost nova_compute[281288]: 2026-02-20 09:43:51.449 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:51 localhost nova_compute[281288]: 2026-02-20 09:43:51.450 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:43:51 localhost ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)... 
Feb 20 04:43:51 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain Feb 20 04:43:51 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:51 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:51 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:43:51 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:43:52 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)... Feb 20 04:43:52 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain Feb 20 04:43:52 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:52 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:52 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:43:52 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:43:53 localhost ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)... 
Feb 20 04:43:53 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain Feb 20 04:43:53 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:53 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' Feb 20 04:43:53 localhost ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)... Feb 20 04:43:53 localhost ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:43:53 localhost ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:43:53 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:43:54 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 e88: 6 total, 6 up, 6 in Feb 20 04:43:54 localhost systemd[1]: session-66.scope: Deactivated successfully. Feb 20 04:43:54 localhost systemd[1]: session-66.scope: Consumed 6.710s CPU time. Feb 20 04:43:54 localhost systemd-logind[759]: Session 66 logged out. Waiting for processes to exit. Feb 20 04:43:54 localhost systemd-logind[759]: Removed session 66. Feb 20 04:43:54 localhost sshd[293896]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:43:54 localhost systemd-logind[759]: New session 68 of user ceph-admin. Feb 20 04:43:54 localhost systemd[1]: Started Session 68 of User ceph-admin. Feb 20 04:43:54 localhost ceph-mon[288586]: from='client.? 172.18.0.200:0/3880794004' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 20 04:43:54 localhost ceph-mon[288586]: Activating manager daemon np0005625202.arwxwo Feb 20 04:43:54 localhost ceph-mon[288586]: from='client.? 
172.18.0.200:0/3880794004' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 20 04:43:54 localhost ceph-mon[288586]: Manager daemon np0005625202.arwxwo is now available Feb 20 04:43:54 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/mirror_snapshot_schedule"} : dispatch Feb 20 04:43:54 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/mirror_snapshot_schedule"} : dispatch Feb 20 04:43:54 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/trash_purge_schedule"} : dispatch Feb 20 04:43:54 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/trash_purge_schedule"} : dispatch Feb 20 04:43:55 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:43:55 localhost nova_compute[281288]: 2026-02-20 09:43:55.451 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:55 localhost nova_compute[281288]: 2026-02-20 09:43:55.480 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:43:55 localhost nova_compute[281288]: 2026-02-20 09:43:55.481 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:43:55 localhost nova_compute[281288]: 2026-02-20 
09:43:55.481 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:55 localhost nova_compute[281288]: 2026-02-20 09:43:55.483 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:43:55 localhost nova_compute[281288]: 2026-02-20 09:43:55.483 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:43:55 localhost systemd[1]: tmp-crun.8BfIHi.mount: Deactivated successfully. Feb 20 04:43:55 localhost podman[294004]: 2026-02-20 09:43:55.628188687 +0000 UTC m=+0.096686570 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, name=rhceph, version=7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 
9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, architecture=x86_64) Feb 20 04:43:55 localhost podman[294004]: 2026-02-20 09:43:55.758160614 +0000 UTC m=+0.226658487 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=1770267347, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, RELEASE=main, io.buildah.version=1.42.2, vcs-type=git, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:43:56 localhost ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Bus STARTING Feb 20 04:43:56 localhost ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Serving on http://172.18.0.106:8765 Feb 20 04:43:56 localhost ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Serving on https://172.18.0.106:7150 Feb 20 04:43:56 localhost ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Bus STARTED Feb 20 04:43:56 localhost 
ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Client ('172.18.0.106', 42946) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 20 04:43:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:43:56 localhost podman[294092]: 2026-02-20 09:43:56.298242828 +0000 UTC m=+0.094670050 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:43:56 localhost podman[294092]: 2026-02-20 09:43:56.30800384 +0000 UTC m=+0.104431092 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:43:56 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:43:56 localhost openstack_network_exporter[244414]: ERROR 09:43:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:43:56 localhost openstack_network_exporter[244414]: Feb 20 04:43:56 localhost openstack_network_exporter[244414]: ERROR 09:43:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:43:56 localhost openstack_network_exporter[244414]: Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:57 localhost 
sshd[294262]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: 
from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: Adjusting 
osd_memory_target on np0005625204.localdomain to 836.6M Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M Feb 20 04:43:58 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M Feb 20 04:43:58 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:43:58 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:43:58 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:43:58 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:43:58 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:58 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:58 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:58 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:58 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:43:58 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:00 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 
20 04:44:00 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:00 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:00 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:00 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:44:00 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:44:00 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:44:00 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:44:00 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:44:00 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:00 localhost nova_compute[281288]: 2026-02-20 09:44:00.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:00 localhost nova_compute[281288]: 2026-02-20 09:44:00.486 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:00 localhost nova_compute[281288]: 2026-02-20 09:44:00.487 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:00 localhost nova_compute[281288]: 2026-02-20 09:44:00.487 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering 
IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:00 localhost nova_compute[281288]: 2026-02-20 09:44:00.538 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:00 localhost nova_compute[281288]: 2026-02-20 09:44:00.539 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:01 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:44:01 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:44:01 localhost ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:44:01 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:44:01 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:02 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:02 localhost ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)... Feb 20 04:44:02 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:02 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:44:03 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:03 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:03 localhost ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)... 
Feb 20 04:44:03 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 20 04:44:03 localhost ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain Feb 20 04:44:04 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:04 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:04 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:04 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:04 localhost ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)... Feb 20 04:44:04 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 20 04:44:04 localhost ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain Feb 20 04:44:04 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:04 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. 
Feb 20 04:44:04 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:04.997236) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:44:04 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Feb 20 04:44:04 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580644997355, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1806, "num_deletes": 254, "total_data_size": 9515829, "memory_usage": 10070552, "flush_reason": "Manual Compaction"} Feb 20 04:44:04 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645029221, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5847284, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13939, "largest_seqno": 15738, "table_properties": {"data_size": 5839303, "index_size": 4614, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 20862, "raw_average_key_size": 22, "raw_value_size": 5822041, "raw_average_value_size": 6383, "num_data_blocks": 197, "num_entries": 912, "num_filter_entries": 912, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580605, "oldest_key_time": 1771580605, "file_creation_time": 1771580644, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 32048 microseconds, and 11488 cpu microseconds. Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.029296) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5847284 bytes OK Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.029336) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.031159) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.031182) EVENT_LOG_v1 {"time_micros": 1771580645031176, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.031214) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 9506701, prev total WAL file size 
9514805, number of live WAL files 2. Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.033035) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353036' seq:72057594037927935, type:22 .. '6D6772737461740033373537' seq:0, type:0; will stop at (end) Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(5710KB)], [15(14MB)] Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645033115, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 20910512, "oldest_snapshot_seqno": -1} Feb 20 04:44:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10235 keys, 18649997 bytes, temperature: kUnknown Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645121184, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 18649997, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18590071, "index_size": 33265, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 271622, "raw_average_key_size": 26, "raw_value_size": 18413740, "raw_average_value_size": 1799, "num_data_blocks": 1289, "num_entries": 10235, "num_filter_entries": 10235, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580645, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.121536) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 18649997 bytes Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.123036) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 237.1 rd, 211.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.6, 14.4 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(6.8) write-amplify(3.2) OK, records in: 10768, records dropped: 533 output_compression: NoCompression Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.123063) EVENT_LOG_v1 {"time_micros": 1771580645123052, "job": 6, "event": "compaction_finished", "compaction_time_micros": 88184, "compaction_time_cpu_micros": 49158, "output_level": 6, "num_output_files": 1, "total_output_size": 18649997, "num_input_records": 10768, "num_output_records": 10235, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645123947, "job": 6, "event": "table_file_deletion", "file_number": 17} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645126158, "job": 
6, "event": "table_file_deletion", "file_number": 15} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.032874) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0. 
Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126851) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645126974, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 275, "num_deletes": 264, "total_data_size": 20879, "memory_usage": 28472, "flush_reason": "Manual Compaction"} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645129475, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 13496, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15740, "largest_seqno": 16013, "table_properties": {"data_size": 11691, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4192, "raw_average_key_size": 15, "raw_value_size": 8092, "raw_average_value_size": 29, "num_data_blocks": 2, "num_entries": 274, "num_filter_entries": 274, "num_deletions": 264, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580645, "oldest_key_time": 1771580645, "file_creation_time": 1771580645, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 2635 microseconds, and 1199 cpu microseconds. Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.129517) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 13496 bytes OK Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.129540) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131089) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131105) EVENT_LOG_v1 {"time_micros": 1771580645131101, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131133) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 18714, prev total WAL file size 18714, number of live 
WAL files 2. Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131736) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303039' seq:72057594037927935, type:22 .. '6B760031323734' seq:0, type:0; will stop at (end) Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(13KB)], [18(17MB)] Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645131778, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 18663493, "oldest_snapshot_seqno": -1} Feb 20 04:44:05 localhost podman[294923]: 2026-02-20 09:44:05.150336912 +0000 UTC m=+0.087465153 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 
'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:44:05 localhost podman[294923]: 2026-02-20 09:44:05.188118125 +0000 UTC m=+0.125246376 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:44:05 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 9972 keys, 17685845 bytes, temperature: kUnknown Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645203821, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 17685845, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, 
"table_properties": {"data_size": 17629019, "index_size": 30805, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24965, "raw_key_size": 267681, "raw_average_key_size": 26, "raw_value_size": 17458464, "raw_average_value_size": 1750, "num_data_blocks": 1165, "num_entries": 9972, "num_filter_entries": 9972, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580645, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.204200) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 17685845 bytes Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.207188) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 258.5 rd, 244.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 17.8 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(2693.3) write-amplify(1310.5) OK, records in: 10509, records dropped: 537 output_compression: NoCompression Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.207209) EVENT_LOG_v1 {"time_micros": 1771580645207199, "job": 8, "event": "compaction_finished", "compaction_time_micros": 72205, "compaction_time_cpu_micros": 37141, "output_level": 6, "num_output_files": 1, "total_output_size": 17685845, "num_input_records": 10509, "num_output_records": 9972, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645207350, "job": 8, "event": "table_file_deletion", "file_number": 20} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645209149, 
"job": 8, "event": "table_file_deletion", "file_number": 18} Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209246) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209269) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:05 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:44:05 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:05 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:05 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:05 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:05 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:05 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)... Feb 20 04:44:05 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:05 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:44:05 localhost nova_compute[281288]: 2026-02-20 09:44:05.538 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:44:06.007 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:44:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:44:06.008 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:44:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:44:06.008 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:44:06 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:06 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:06 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:06 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... Feb 20 04:44:06 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:06 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:44:06 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:06 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:06 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:44:07 localhost ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)... 
Feb 20 04:44:07 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:44:07 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:07 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:07 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:07 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:08 localhost ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)... Feb 20 04:44:08 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:44:08 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:08 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:08 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 20 04:44:09 localhost ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)... 
Feb 20 04:44:09 localhost ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:44:09 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:09 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:09 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:09 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:09 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 20 04:44:10 localhost ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:10 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Feb 20 04:44:10 localhost ceph-mon[288586]: mon.np0005625204@2(peon) e10 my rank is now 1 (was 2) Feb 20 04:44:10 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44000 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Feb 20 04:44:10 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44160 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Feb 20 04:44:10 localhost ceph-osd[32226]: --2- [v2:172.18.0.108:6800/2098983975,v1:172.18.0.108:6801/2098983975] >> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] conn(0x55bf91f9d800 0x55bf8ee57180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Feb 20 04:44:10 localhost nova_compute[281288]: 2026-02-20 09:44:10.541 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:10 localhost nova_compute[281288]: 2026-02-20 
09:44:10.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:10 localhost nova_compute[281288]: 2026-02-20 09:44:10.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:10 localhost nova_compute[281288]: 2026-02-20 09:44:10.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:10 localhost nova_compute[281288]: 2026-02-20 09:44:10.585 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:10 localhost nova_compute[281288]: 2026-02-20 09:44:10.586 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:44:11 localhost podman[294946]: 2026-02-20 09:44:11.128725476 +0000 UTC m=+0.069238420 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:44:11 localhost podman[294946]: 2026-02-20 09:44:11.166102804 +0000 UTC m=+0.106615738 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:44:11 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:44:12 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:44:12 localhost ceph-mon[288586]: paxos.1).electionLogic(38) init, last seen epoch 38 Feb 20 04:44:12 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:12 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:12 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:12 localhost ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:44:12 localhost ceph-mon[288586]: Remove daemons mon.np0005625200 Feb 20 04:44:12 localhost ceph-mon[288586]: Safe to remove mon.np0005625200: new quorum should be ['np0005625201', 'np0005625204', 'np0005625203', 'np0005625202'] (from ['np0005625201', 'np0005625204', 'np0005625203', 'np0005625202']) Feb 20 04:44:12 localhost ceph-mon[288586]: Removing monitor np0005625200 from monmap... 
Feb 20 04:44:12 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon rm", "name": "np0005625200"} : dispatch Feb 20 04:44:12 localhost ceph-mon[288586]: Removing daemon mon.np0005625200 from np0005625200.localdomain -- ports [] Feb 20 04:44:12 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:44:12 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:44:12 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:44:12 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:44:12 localhost ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625204,np0005625203,np0005625202 in quorum (ranks 0,1,2,3) Feb 20 04:44:12 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:44:12 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:12 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:44:13 localhost podman[294969]: 2026-02-20 09:44:13.144583523 +0000 UTC m=+0.084000998 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:44:13 localhost podman[294969]: 2026-02-20 09:44:13.158900188 +0000 UTC m=+0.098317623 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.buildah.version=1.33.7, 
config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, version=9.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter) Feb 20 04:44:13 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:44:13 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:13 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:13 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:13 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)... 
Feb 20 04:44:13 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:13 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain Feb 20 04:44:13 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:13 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:13 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:13 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:14 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)... Feb 20 04:44:14 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain Feb 20 04:44:14 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:14 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:14 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:44:15 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:15 localhost ceph-mon[288586]: Reconfiguring mon.np0005625203 (monmap changed)... 
Feb 20 04:44:15 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain Feb 20 04:44:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:15 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:15 localhost nova_compute[281288]: 2026-02-20 09:44:15.587 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:15 localhost nova_compute[281288]: 2026-02-20 09:44:15.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:15 localhost nova_compute[281288]: 2026-02-20 09:44:15.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:15 localhost nova_compute[281288]: 2026-02-20 09:44:15.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:15 localhost nova_compute[281288]: 2026-02-20 09:44:15.624 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Feb 20 04:44:15 localhost nova_compute[281288]: 2026-02-20 09:44:15.625 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:15 localhost podman[295039]: Feb 20 04:44:15 localhost podman[295039]: 2026-02-20 09:44:15.924793323 +0000 UTC m=+0.079724317 container create ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, distribution-scope=public, version=7, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:44:15 localhost systemd[1]: Started libpod-conmon-ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997.scope. Feb 20 04:44:15 localhost systemd[1]: Started libcrun container. 
Feb 20 04:44:15 localhost podman[295039]: 2026-02-20 09:44:15.89217896 +0000 UTC m=+0.047109984 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 04:44:16 localhost podman[295039]: 2026-02-20 09:44:16.002138331 +0000 UTC m=+0.157069315 container init ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, release=1770267347, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, RELEASE=main)
Feb 20 04:44:16 localhost podman[295039]: 2026-02-20 09:44:16.013446101 +0000 UTC m=+0.168377085 container start ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.buildah.version=1.42.2, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 20 04:44:16 localhost podman[295039]: 2026-02-20 09:44:16.013702628 +0000 UTC m=+0.168633642 container attach ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, com.redhat.component=rhceph-container)
Feb 20 04:44:16 localhost systemd[1]: libpod-ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997.scope: Deactivated successfully.
Feb 20 04:44:16 localhost reverent_noether[295054]: 167 167
Feb 20 04:44:16 localhost podman[295039]: 2026-02-20 09:44:16.0190705 +0000 UTC m=+0.174001494 container died ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, release=1770267347, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 04:44:16 localhost podman[295059]: 2026-02-20 09:44:16.123867755 +0000 UTC m=+0.091437278 container remove ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.expose-services=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z)
Feb 20 04:44:16 localhost systemd[1]: libpod-conmon-ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997.scope: Deactivated successfully.
Feb 20 04:44:16 localhost ceph-mon[288586]: Removed label mon from host np0005625200.localdomain
Feb 20 04:44:16 localhost ceph-mon[288586]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 04:44:16 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 04:44:16 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo'
Feb 20 04:44:16 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo'
Feb 20 04:44:16 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 04:44:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:44:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:44:16 localhost podman[295141]:
Feb 20 04:44:16 localhost podman[295127]: 2026-02-20 09:44:16.836768661 +0000 UTC m=+0.091548081 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:44:16 localhost podman[295141]: 2026-02-20 09:44:16.847580937 +0000 UTC m=+0.081622600 container create 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.42.2, ceph=True, build-date=2026-02-09T10:25:24Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7)
Feb 20 04:44:16 localhost podman[295127]: 2026-02-20 09:44:16.868358075 +0000 UTC m=+0.123137515 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:44:16 localhost systemd[1]: Started libpod-conmon-3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114.scope.
Feb 20 04:44:16 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:44:16 localhost podman[295128]: 2026-02-20 09:44:16.894942257 +0000 UTC m=+0.149597703 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 04:44:16 localhost systemd[1]: Started libcrun container.
Feb 20 04:44:16 localhost podman[295141]: 2026-02-20 09:44:16.918805922 +0000 UTC m=+0.152847525 container init 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, release=1770267347, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Feb 20 04:44:16 localhost podman[295141]: 2026-02-20 09:44:16.819922695 +0000 UTC m=+0.053964308 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 04:44:16 localhost podman[295141]: 2026-02-20 09:44:16.929127254 +0000 UTC m=+0.163168877 container start 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, GIT_CLEAN=True, GIT_BRANCH=main, release=1770267347, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 04:44:16 localhost podman[295141]: 2026-02-20 09:44:16.930356248 +0000 UTC m=+0.164397841 container attach 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1770267347, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 04:44:16 localhost competent_mirzakhani[295183]: 167 167
Feb 20 04:44:16 localhost podman[295128]: 2026-02-20 09:44:16.930485753 +0000 UTC m=+0.185141169 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 04:44:16 localhost systemd[1]: var-lib-containers-storage-overlay-2aeb1a08d248c8a968aaccb4f356f1ba0c1d0611474ef7f1749fd8ddc73e3f34-merged.mount: Deactivated successfully.
Feb 20 04:44:16 localhost systemd[1]: libpod-3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114.scope: Deactivated successfully.
Feb 20 04:44:16 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:44:16 localhost podman[295141]: 2026-02-20 09:44:16.985970592 +0000 UTC m=+0.220012225 container died 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, release=1770267347, version=7, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 04:44:17 localhost systemd[1]: var-lib-containers-storage-overlay-6b01395a0477de3600c6caafe1127fb6de453160cbc61868ec55cd50b5eea17d-merged.mount: Deactivated successfully.
Feb 20 04:44:17 localhost podman[295188]: 2026-02-20 09:44:17.021929389 +0000 UTC m=+0.081685151 container remove 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , release=1770267347, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph)
Feb 20 04:44:17 localhost systemd[1]: libpod-conmon-3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114.scope: Deactivated successfully.
Feb 20 04:44:17 localhost ceph-mon[288586]: Reconfiguring osd.0 (monmap changed)...
Feb 20 04:44:17 localhost ceph-mon[288586]: Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 04:44:17 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo'
Feb 20 04:44:17 localhost ceph-mon[288586]: Removed label mgr from host np0005625200.localdomain
Feb 20 04:44:17 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo'
Feb 20 04:44:17 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo'
Feb 20 04:44:17 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo'
Feb 20 04:44:17 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo'
Feb 20 04:44:17 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 04:44:17 localhost podman[241968]: time="2026-02-20T09:44:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 04:44:17 localhost podman[241968]: @ - - [20/Feb/2026:09:44:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 04:44:17 localhost podman[241968]: @ - - [20/Feb/2026:09:44:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18262 "" "Go-http-client/1.1"
Feb 20 04:44:17 localhost podman[295261]:
Feb 20 04:44:17 localhost podman[295261]: 2026-02-20 09:44:17.880442276 +0000 UTC m=+0.064196627 container create 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1770267347, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, RELEASE=main, ceph=True, build-date=2026-02-09T10:25:24Z)
Feb 20 04:44:17 localhost systemd[1]: Started libpod-conmon-4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68.scope.
Feb 20 04:44:17 localhost systemd[1]: Started libcrun container.
Feb 20 04:44:17 localhost podman[295261]: 2026-02-20 09:44:17.942853701 +0000 UTC m=+0.126608052 container init 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, ceph=True, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 04:44:17 localhost podman[295261]: 2026-02-20 09:44:17.850286163 +0000 UTC m=+0.034040554 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 04:44:17 localhost podman[295261]: 2026-02-20 09:44:17.952777662 +0000 UTC m=+0.136532013 container start 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.42.2, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, architecture=x86_64)
Feb 20 04:44:17 localhost upbeat_curran[295276]: 167 167
Feb 20 04:44:17 localhost podman[295261]: 2026-02-20 09:44:17.953004999 +0000 UTC m=+0.136759350 container attach 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, version=7, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=)
Feb 20 04:44:17 localhost systemd[1]: libpod-4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68.scope: Deactivated successfully.
Feb 20 04:44:17 localhost podman[295261]: 2026-02-20 09:44:17.958858514 +0000 UTC m=+0.142612865 container died 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, release=1770267347, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 04:44:18 localhost systemd[1]: tmp-crun.0ZvC2d.mount: Deactivated successfully.
Feb 20 04:44:18 localhost systemd[1]: var-lib-containers-storage-overlay-7915fbe5bfbe5ba878e439366042180654b88659784cc214a7e4c22c72ebb1dc-merged.mount: Deactivated successfully.
Feb 20 04:44:18 localhost podman[295281]: 2026-02-20 09:44:18.055117047 +0000 UTC m=+0.089167354 container remove 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph)
Feb 20 04:44:18 localhost systemd[1]: libpod-conmon-4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68.scope: Deactivated successfully.
Feb 20 04:44:18 localhost ceph-mon[288586]: Reconfiguring osd.3 (monmap changed)...
Feb 20 04:44:18 localhost ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain Feb 20 04:44:18 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:18 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:18 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:18 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:18 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:18 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:18 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:18 localhost podman[295358]: Feb 20 04:44:18 localhost podman[295358]: 2026-02-20 09:44:18.893843463 +0000 UTC m=+0.077318508 container create b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.42.2, release=1770267347, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, 
version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git) Feb 20 04:44:18 localhost systemd[1]: Started libpod-conmon-b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5.scope. Feb 20 04:44:18 localhost systemd[1]: Started libcrun container. Feb 20 04:44:18 localhost podman[295358]: 2026-02-20 09:44:18.959952904 +0000 UTC m=+0.143427949 container init b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.42.2, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vcs-type=git, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True) Feb 20 04:44:18 localhost podman[295358]: 2026-02-20 09:44:18.86545841 +0000 UTC m=+0.048933485 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:44:18 localhost podman[295358]: 2026-02-20 09:44:18.968881506 +0000 UTC m=+0.152356561 container start b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=1770267347, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z) Feb 20 04:44:18 localhost podman[295358]: 2026-02-20 09:44:18.969347709 +0000 UTC m=+0.152822784 container attach b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=dazzling_diffie, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64) Feb 20 04:44:18 localhost dazzling_diffie[295373]: 167 167 Feb 20 04:44:18 localhost systemd[1]: libpod-b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5.scope: Deactivated successfully. 
Feb 20 04:44:18 localhost podman[295358]: 2026-02-20 09:44:18.972331433 +0000 UTC m=+0.155806508 container died b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:44:19 localhost systemd[1]: var-lib-containers-storage-overlay-337adc02fe9baf23e554155ed2c678d7a7aef48b39255cae15d0e3b5a45b57c6-merged.mount: Deactivated successfully. 
Feb 20 04:44:19 localhost podman[295378]: 2026-02-20 09:44:19.064082409 +0000 UTC m=+0.084799399 container remove b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347) Feb 20 04:44:19 localhost systemd[1]: libpod-conmon-b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5.scope: Deactivated successfully. Feb 20 04:44:19 localhost ceph-mon[288586]: Removed label _admin from host np0005625200.localdomain Feb 20 04:44:19 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)... 
Feb 20 04:44:19 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain Feb 20 04:44:19 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:19 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:19 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:19 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:19 localhost podman[295448]: Feb 20 04:44:19 localhost podman[295448]: 2026-02-20 09:44:19.768895238 +0000 UTC m=+0.074791857 container create f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured 
and supported base image., vcs-type=git, name=rhceph, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:44:19 localhost systemd[1]: Started libpod-conmon-f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4.scope. Feb 20 04:44:19 localhost systemd[1]: Started libcrun container. Feb 20 04:44:19 localhost podman[295448]: 2026-02-20 09:44:19.828937996 +0000 UTC m=+0.134834615 container init f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, io.buildah.version=1.42.2, GIT_BRANCH=main, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 20 04:44:19 localhost podman[295448]: 2026-02-20 09:44:19.738849688 +0000 UTC m=+0.044746377 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:44:19 localhost podman[295448]: 2026-02-20 09:44:19.838213618 +0000 UTC m=+0.144110257 container start f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:44:19 localhost podman[295448]: 2026-02-20 09:44:19.838499897 +0000 UTC m=+0.144396566 container attach f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 04:44:19 localhost suspicious_sanderson[295463]: 167 167 Feb 20 04:44:19 localhost systemd[1]: libpod-f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4.scope: Deactivated successfully. 
Feb 20 04:44:19 localhost podman[295448]: 2026-02-20 09:44:19.841820841 +0000 UTC m=+0.147717490 container died f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, maintainer=Guillaume Abrioux , vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph) Feb 20 04:44:19 localhost sshd[295466]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:44:19 localhost podman[295469]: 2026-02-20 09:44:19.936216721 +0000 UTC m=+0.081864597 container remove f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, name=rhceph, architecture=x86_64, 
release=1770267347, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, RELEASE=main, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:44:19 localhost systemd[1]: libpod-conmon-f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4.scope: Deactivated successfully. Feb 20 04:44:20 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:20 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)... 
Feb 20 04:44:20 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain Feb 20 04:44:20 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:20 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:20 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:44:20 localhost podman[295539]: Feb 20 04:44:20 localhost nova_compute[281288]: 2026-02-20 09:44:20.626 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:20 localhost nova_compute[281288]: 2026-02-20 09:44:20.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:20 localhost nova_compute[281288]: 2026-02-20 09:44:20.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:20 localhost nova_compute[281288]: 2026-02-20 09:44:20.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:20 localhost nova_compute[281288]: 2026-02-20 09:44:20.666 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:20 localhost nova_compute[281288]: 2026-02-20 09:44:20.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:20 localhost podman[295539]: 2026-02-20 09:44:20.686052893 +0000 UTC m=+0.121306143 container create 
ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, version=7, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, distribution-scope=public, release=1770267347, architecture=x86_64, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=) Feb 20 04:44:20 localhost podman[295539]: 2026-02-20 09:44:20.603906399 +0000 UTC m=+0.039159719 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:44:20 localhost systemd[1]: Started libpod-conmon-ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f.scope. Feb 20 04:44:20 localhost systemd[1]: Started libcrun container. 
Feb 20 04:44:20 localhost podman[295539]: 2026-02-20 09:44:20.749169379 +0000 UTC m=+0.184422659 container init ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:44:20 localhost podman[295539]: 2026-02-20 09:44:20.75877024 +0000 UTC m=+0.194023490 container start ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported 
base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=1770267347, com.redhat.component=rhceph-container, version=7, ceph=True) Feb 20 04:44:20 localhost podman[295539]: 2026-02-20 09:44:20.758979226 +0000 UTC m=+0.194232546 container attach ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, build-date=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, 
architecture=x86_64, io.buildah.version=1.42.2, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:44:20 localhost systemd[1]: tmp-crun.YWRtJV.mount: Deactivated successfully. Feb 20 04:44:20 localhost gifted_mendel[295555]: 167 167 Feb 20 04:44:20 localhost systemd[1]: libpod-ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f.scope: Deactivated successfully. Feb 20 04:44:20 localhost podman[295539]: 2026-02-20 09:44:20.763736911 +0000 UTC m=+0.198990261 container died ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, release=1770267347, vcs-type=git, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True) Feb 20 04:44:20 localhost podman[295560]: 2026-02-20 09:44:20.856266538 +0000 UTC m=+0.080266312 container remove 
ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, RELEASE=main, version=7, io.openshift.expose-services=, release=1770267347, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2) Feb 20 04:44:20 localhost systemd[1]: libpod-conmon-ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f.scope: Deactivated successfully. Feb 20 04:44:20 localhost systemd[1]: var-lib-containers-storage-overlay-ffe47c171a5ec023f1a2d56eb315ba19babf288d1155f2f24243248737177bbf-merged.mount: Deactivated successfully. Feb 20 04:44:21 localhost ceph-mon[288586]: Reconfiguring mon.np0005625204 (monmap changed)... 
Feb 20 04:44:21 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain Feb 20 04:44:21 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:21 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:23 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:23 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:23 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:44:23 localhost ceph-mon[288586]: Removing np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:23 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:44:23 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:44:23 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:44:23 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:44:23 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:23 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: Removing np0005625200.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:44:24 localhost ceph-mon[288586]: Removing np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:44:24 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:24 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:24 localhost 
ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:24 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:24 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:25 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:25 localhost ceph-mon[288586]: Removing daemon mgr.np0005625200.ypbkax from np0005625200.localdomain -- ports [8765] Feb 20 04:44:25 localhost nova_compute[281288]: 2026-02-20 09:44:25.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:25 localhost nova_compute[281288]: 2026-02-20 09:44:25.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:25 localhost nova_compute[281288]: 2026-02-20 09:44:25.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 
ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:25 localhost nova_compute[281288]: 2026-02-20 09:44:25.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:25 localhost nova_compute[281288]: 2026-02-20 09:44:25.703 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:25 localhost nova_compute[281288]: 2026-02-20 09:44:25.704 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:26 localhost openstack_network_exporter[244414]: ERROR 09:44:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:44:26 localhost openstack_network_exporter[244414]: Feb 20 04:44:26 localhost openstack_network_exporter[244414]: ERROR 09:44:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:44:26 localhost openstack_network_exporter[244414]: Feb 20 04:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:44:27 localhost podman[295896]: 2026-02-20 09:44:27.138783873 +0000 UTC m=+0.073906242 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team) Feb 20 04:44:27 localhost podman[295896]: 2026-02-20 09:44:27.14998426 +0000 UTC m=+0.085106659 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, tcib_managed=true) Feb 20 04:44:27 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:44:27 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "mgr.np0005625200.ypbkax"} : dispatch Feb 20 04:44:27 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "mgr.np0005625200.ypbkax"} : dispatch Feb 20 04:44:27 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005625200.ypbkax"}]': finished Feb 20 04:44:27 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:27 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:28 localhost ceph-mon[288586]: Removing key for mgr.np0005625200.ypbkax Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": 
["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:29 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.043437) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670043514, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1289, "num_deletes": 251, "total_data_size": 2103769, "memory_usage": 2147472, "flush_reason": "Manual Compaction"} Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670052155, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1203162, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16018, "largest_seqno": 17302, "table_properties": {"data_size": 1197410, "index_size": 2903, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15654, "raw_average_key_size": 22, "raw_value_size": 1184691, "raw_average_value_size": 1692, "num_data_blocks": 125, "num_entries": 700, "num_filter_entries": 700, "num_deletions": 251, 
"num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580645, "oldest_key_time": 1771580645, "file_creation_time": 1771580670, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 8756 microseconds, and 4192 cpu microseconds. Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.052203) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1203162 bytes OK Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.052226) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056049) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056076) EVENT_LOG_v1 {"time_micros": 1771580670056071, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056099) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2097075, prev total WAL file size 2097075, number of live WAL files 2. Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056797) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. 
'7061786F73003130353433' seq:0, type:0; will stop at (end) Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1174KB)], [21(16MB)] Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670056841, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 18889007, "oldest_snapshot_seqno": -1} Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10134 keys, 15189048 bytes, temperature: kUnknown Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670125784, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15189048, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15132495, "index_size": 30148, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 272356, "raw_average_key_size": 26, "raw_value_size": 14960414, "raw_average_value_size": 1476, "num_data_blocks": 1137, "num_entries": 10134, "num_filter_entries": 10134, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580670, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.126101) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15189048 bytes Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.127712) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 273.5 rd, 219.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 16.9 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(28.3) write-amplify(12.6) OK, records in: 10672, records dropped: 538 output_compression: NoCompression Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.127740) EVENT_LOG_v1 {"time_micros": 1771580670127727, "job": 10, "event": "compaction_finished", "compaction_time_micros": 69067, "compaction_time_cpu_micros": 41206, "output_level": 6, "num_output_files": 1, "total_output_size": 15189048, "num_input_records": 10672, "num_output_records": 10134, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005625204/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670128029, "job": 10, "event": "table_file_deletion", "file_number": 23} Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670130744, "job": 10, "event": "table_file_deletion", "file_number": 21} Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:30 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:44:30 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:30 
localhost ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)... Feb 20 04:44:30 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain Feb 20 04:44:30 localhost ceph-mon[288586]: Added label _no_schedule to host np0005625200.localdomain Feb 20 04:44:30 localhost ceph-mon[288586]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005625200.localdomain Feb 20 04:44:30 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:30 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:30 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:44:30 localhost nova_compute[281288]: 2026-02-20 09:44:30.705 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:30 localhost nova_compute[281288]: 2026-02-20 09:44:30.708 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:31 localhost ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)... 
Feb 20 04:44:31 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain Feb 20 04:44:31 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:31 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:31 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:31 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:32 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)... Feb 20 04:44:32 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain Feb 20 04:44:32 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:32 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:32 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:32 localhost ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)... 
Feb 20 04:44:32 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:32 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain Feb 20 04:44:32 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:32 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain"} : dispatch Feb 20 04:44:32 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain"} : dispatch Feb 20 04:44:32 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain"}]': finished Feb 20 04:44:32 localhost ceph-mon[288586]: Removed host np0005625200.localdomain Feb 20 04:44:34 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:34 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:34 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:34 localhost ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)... 
Feb 20 04:44:34 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:34 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:44:34 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:34 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:34 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 20 04:44:35 localhost ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)... Feb 20 04:44:35 localhost ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain Feb 20 04:44:35 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:35 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:35 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 20 04:44:35 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:35 localhost nova_compute[281288]: 2026-02-20 09:44:35.707 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:44:36 localhost ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)... 
Feb 20 04:44:36 localhost ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain Feb 20 04:44:36 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:36 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:36 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:36 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:36 localhost podman[295951]: 2026-02-20 09:44:36.141322561 +0000 UTC m=+0.080263612 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:44:36 localhost podman[295951]: 2026-02-20 
09:44:36.154075073 +0000 UTC m=+0.093016134 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:44:36 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:44:37 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)... 
Feb 20 04:44:37 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:44:37 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:37 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:37 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:37 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:38 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... Feb 20 04:44:38 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:44:38 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:38 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:38 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:44:39 localhost ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)... 
Feb 20 04:44:39 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:44:39 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:39 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:39 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:39 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:39 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:39 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:39 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 20 04:44:40 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:40 localhost ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)... Feb 20 04:44:40 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:44:40 localhost ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)... 
Feb 20 04:44:40 localhost ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:44:40 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:40 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:40 localhost ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)... Feb 20 04:44:40 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 20 04:44:40 localhost ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:44:40 localhost nova_compute[281288]: 2026-02-20 09:44:40.709 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:40 localhost nova_compute[281288]: 2026-02-20 09:44:40.711 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:40 localhost nova_compute[281288]: 2026-02-20 09:44:40.711 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:40 localhost nova_compute[281288]: 2026-02-20 09:44:40.712 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:40 localhost nova_compute[281288]: 2026-02-20 09:44:40.758 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:40 localhost nova_compute[281288]: 2026-02-20 09:44:40.759 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:41 
localhost ceph-mon[288586]: Saving service mon spec with placement label:mon Feb 20 04:44:41 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:41 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:41 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:41 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:44:41 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:44:42 localhost systemd[1]: tmp-crun.M9G0kb.mount: Deactivated successfully. Feb 20 04:44:42 localhost podman[295994]: 2026-02-20 09:44:42.096729354 +0000 UTC m=+0.091760977 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:44:42 localhost podman[295994]: 2026-02-20 09:44:42.131062455 +0000 UTC m=+0.126094078 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:44:42 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:44:43 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc442c0 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Feb 20 04:44:43 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:44:43 localhost ceph-mon[288586]: paxos.1).electionLogic(40) init, last seen epoch 40 Feb 20 04:44:43 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:43 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:44:44 localhost podman[296017]: 2026-02-20 09:44:44.140144318 +0000 UTC m=+0.079410957 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Feb 20 04:44:44 localhost podman[296017]: 2026-02-20 09:44:44.183123605 +0000 UTC m=+0.122390204 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, version=9.7, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git) Feb 20 04:44:44 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.747 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.747 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.748 281292 DEBUG oslo_concurrency.processutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.761 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.764 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.764 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.764 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.790 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:45 localhost nova_compute[281288]: 2026-02-20 09:44:45.791 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:44:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:44:47 localhost systemd[1]: tmp-crun.xd8oAh.mount: Deactivated successfully. 
Feb 20 04:44:47 localhost podman[296046]: 2026-02-20 09:44:47.149514541 +0000 UTC m=+0.086741595 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0) Feb 20 04:44:47 localhost podman[296047]: 2026-02-20 09:44:47.201350747 +0000 UTC m=+0.136305897 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:44:47 localhost podman[296047]: 2026-02-20 09:44:47.210237538 +0000 UTC m=+0.145192688 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 20 04:44:47 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:44:47 localhost podman[296046]: 2026-02-20 09:44:47.260780187 +0000 UTC m=+0.198007241 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 20 04:44:47 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:44:47 localhost podman[241968]: time="2026-02-20T09:44:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:44:47 localhost podman[241968]: @ - - [20/Feb/2026:09:44:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:44:47 localhost podman[241968]: @ - - [20/Feb/2026:09:44:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18259 "" "Go-http-client/1.1" Feb 20 04:44:48 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:48 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:44:48 localhost ceph-mon[288586]: paxos.1).electionLogic(43) init, last seen epoch 43, mid-election, bumping Feb 20 04:44:48 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:48 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e11 handle_timecheck drop unexpected msg Feb 20 04:44:48 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:48 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:44:49 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:44:49 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3982866868' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.277 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.357 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.358 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:44:49 localhost ceph-mon[288586]: Remove daemons mon.np0005625203 Feb 20 04:44:49 localhost ceph-mon[288586]: Safe to remove mon.np0005625203: new quorum should be ['np0005625201', 'np0005625204', 'np0005625202'] (from ['np0005625201', 'np0005625204', 'np0005625202']) Feb 20 04:44:49 localhost ceph-mon[288586]: Removing monitor np0005625203 from monmap... 
Feb 20 04:44:49 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon rm", "name": "np0005625203"} : dispatch Feb 20 04:44:49 localhost ceph-mon[288586]: Removing daemon mon.np0005625203 from np0005625203.localdomain -- ports [] Feb 20 04:44:49 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:44:49 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:44:49 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:44:49 localhost ceph-mon[288586]: Health check failed: 1/3 mons down, quorum np0005625201,np0005625204 (MON_DOWN) Feb 20 04:44:49 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:44:49 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:44:49 localhost ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625204,np0005625202 in quorum (ranks 0,1,2) Feb 20 04:44:49 localhost ceph-mon[288586]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005625201,np0005625204) Feb 20 04:44:49 localhost ceph-mon[288586]: Cluster is now healthy Feb 20 04:44:49 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:44:49 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:44:49 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:44:49 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:44:49 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:44:49 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:44:49 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.590 281292 WARNING 
nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.592 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11789MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.593 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.593 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.659 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:44:49 localhost nova_compute[281288]: 2026-02-20 09:44:49.697 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:44:50 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:44:50 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1776286500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.117 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.123 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.140 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.143 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.143 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:44:50 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:50 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:50 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:50 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:50 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 ' 
entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:50 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.791 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.794 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.794 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.794 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.826 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:50 localhost nova_compute[281288]: 2026-02-20 09:44:50.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:51 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)... 
Feb 20 04:44:51 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain Feb 20 04:44:51 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:51 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:51 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:51 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.144 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.144 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.145 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.145 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.397 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.398 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.398 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.399 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:44:52 localhost ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)... 
Feb 20 04:44:52 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain Feb 20 04:44:52 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:52 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:52 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:52 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.754 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": 
"e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.774 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.775 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.775 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.776 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.776 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:52 localhost 
nova_compute[281288]: 2026-02-20 09:44:52.776 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.777 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.777 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:44:52 localhost nova_compute[281288]: 2026-02-20 09:44:52.778 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:44:53 localhost ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)... Feb 20 04:44:53 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:44:53 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:54 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:54 localhost ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)... 
Feb 20 04:44:54 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 20 04:44:54 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:54 localhost ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain Feb 20 04:44:55 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:44:55 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:55 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:55 localhost ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)... Feb 20 04:44:55 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 20 04:44:55 localhost ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain Feb 20 04:44:55 localhost nova_compute[281288]: 2026-02-20 09:44:55.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:55 localhost nova_compute[281288]: 2026-02-20 09:44:55.829 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:44:55 localhost nova_compute[281288]: 2026-02-20 09:44:55.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:44:55 localhost nova_compute[281288]: 2026-02-20 09:44:55.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 
04:44:55 localhost nova_compute[281288]: 2026-02-20 09:44:55.864 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:44:55 localhost nova_compute[281288]: 2026-02-20 09:44:55.865 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:44:56 localhost openstack_network_exporter[244414]: ERROR 09:44:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:44:56 localhost openstack_network_exporter[244414]: Feb 20 04:44:56 localhost openstack_network_exporter[244414]: ERROR 09:44:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:44:56 localhost openstack_network_exporter[244414]: Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:56 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)... 
Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:44:56 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:44:56 localhost ceph-mon[288586]: Deploying daemon mon.np0005625203 on np0005625203.localdomain Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:56 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:44:57 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... 
Feb 20 04:44:57 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:44:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:57 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:57 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:44:58 localhost podman[296459]: 2026-02-20 09:44:58.150045402 +0000 UTC m=+0.083918884 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:44:58 localhost podman[296459]: 2026-02-20 09:44:58.158596925 +0000 UTC m=+0.092470377 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute) Feb 20 04:44:58 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:44:58 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:44:58 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:44:58 localhost ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)... 
Feb 20 04:44:58 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:44:58 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:59 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:44:59 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:59 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:59 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:44:59 localhost ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)... Feb 20 04:44:59 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 20 04:44:59 localhost ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:45:00 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:45:00 localhost nova_compute[281288]: 2026-02-20 09:45:00.865 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:00 localhost nova_compute[281288]: 2026-02-20 09:45:00.866 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:00 localhost nova_compute[281288]: 2026-02-20 09:45:00.867 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:00 localhost nova_compute[281288]: 2026-02-20 09:45:00.867 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:00 localhost nova_compute[281288]: 2026-02-20 09:45:00.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:00 localhost nova_compute[281288]: 2026-02-20 09:45:00.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:01 localhost ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)... Feb 20 04:45:01 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 20 04:45:01 localhost ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:45:01 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:01 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:02 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:02 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:45:02 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)... 
Feb 20 04:45:02 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:45:02 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain Feb 20 04:45:02 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:02 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:02 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:45:02 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:45:03 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)... 
Feb 20 04:45:03 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain Feb 20 04:45:03 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:03 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:03 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:45:03 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:45:03 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:03 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:03 localhost podman[296532]: Feb 20 04:45:03 localhost podman[296532]: 2026-02-20 09:45:03.651742568 +0000 UTC m=+0.068195250 container create c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, release=1770267347, name=rhceph, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:45:03 localhost systemd[1]: Started libpod-conmon-c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c.scope. Feb 20 04:45:03 localhost podman[296532]: 2026-02-20 09:45:03.621741599 +0000 UTC m=+0.038194341 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:45:03 localhost systemd[1]: Started libcrun container. Feb 20 04:45:03 localhost podman[296532]: 2026-02-20 09:45:03.742261559 +0000 UTC m=+0.158714271 container init c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, release=1770267347, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, 
distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7) Feb 20 04:45:03 localhost podman[296532]: 2026-02-20 09:45:03.753512667 +0000 UTC m=+0.169965379 container start c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.42.2, version=7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux , release=1770267347) Feb 20 04:45:03 localhost podman[296532]: 2026-02-20 09:45:03.753874757 +0000 UTC m=+0.170327529 container attach c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., RELEASE=main, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, version=7, release=1770267347) Feb 20 04:45:03 localhost zealous_napier[296547]: 167 167 Feb 20 04:45:03 localhost systemd[1]: libpod-c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c.scope: Deactivated successfully. 
Feb 20 04:45:03 localhost podman[296532]: 2026-02-20 09:45:03.758405285 +0000 UTC m=+0.174858027 container died c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:45:03 localhost podman[296553]: 2026-02-20 09:45:03.864229799 +0000 UTC m=+0.093515646 container remove c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, io.buildah.version=1.42.2, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-type=git, 
distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1770267347, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7) Feb 20 04:45:03 localhost systemd[1]: libpod-conmon-c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c.scope: Deactivated successfully. Feb 20 04:45:04 localhost ceph-mon[288586]: Reconfiguring crash.np0005625204 (monmap changed)... 
Feb 20 04:45:04 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain Feb 20 04:45:04 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:04 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:04 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 20 04:45:04 localhost podman[296622]: Feb 20 04:45:04 localhost podman[296622]: 2026-02-20 09:45:04.611788316 +0000 UTC m=+0.082580376 container create 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.42.2, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347) Feb 20 04:45:04 localhost systemd[1]: Started 
libpod-conmon-81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485.scope. Feb 20 04:45:04 localhost systemd[1]: var-lib-containers-storage-overlay-243ff6385feb9fe0152935974108027b4116b4fd6887fd9fc30f44497ac86910-merged.mount: Deactivated successfully. Feb 20 04:45:04 localhost systemd[1]: Started libcrun container. Feb 20 04:45:04 localhost podman[296622]: 2026-02-20 09:45:04.580070069 +0000 UTC m=+0.050862169 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:45:04 localhost podman[296622]: 2026-02-20 09:45:04.687538429 +0000 UTC m=+0.158330489 container init 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public) Feb 20 04:45:04 localhost systemd[1]: tmp-crun.IJ4vwV.mount: Deactivated successfully. 
Feb 20 04:45:04 localhost podman[296622]: 2026-02-20 09:45:04.700821975 +0000 UTC m=+0.171614045 container start 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, release=1770267347, io.openshift.tags=rhceph ceph) Feb 20 04:45:04 localhost podman[296622]: 2026-02-20 09:45:04.701171735 +0000 UTC m=+0.171963835 container attach 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, 
io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, RELEASE=main) Feb 20 04:45:04 localhost recursing_meninsky[296637]: 167 167 Feb 20 04:45:04 localhost systemd[1]: libpod-81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485.scope: Deactivated successfully. Feb 20 04:45:04 localhost podman[296622]: 2026-02-20 09:45:04.704943101 +0000 UTC m=+0.175735201 container died 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.buildah.version=1.42.2, RELEASE=main, GIT_BRANCH=main, version=7, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:45:04 localhost systemd[1]: var-lib-containers-storage-overlay-f3047779d42050ad8bd16eda4af33a2351292539bb0a987173088378b2d9e83e-merged.mount: Deactivated successfully. Feb 20 04:45:04 localhost podman[296642]: 2026-02-20 09:45:04.805692721 +0000 UTC m=+0.086882158 container remove 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, 
org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:45:04 localhost systemd[1]: libpod-conmon-81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485.scope: Deactivated successfully. Feb 20 04:45:05 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:45:05 localhost ceph-mon[288586]: Reconfiguring osd.0 (monmap changed)... Feb 20 04:45:05 localhost ceph-mon[288586]: Reconfiguring daemon osd.0 on np0005625204.localdomain Feb 20 04:45:05 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:05 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:05 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 20 04:45:05 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:05 localhost podman[296717]: Feb 20 04:45:05 localhost podman[296717]: 2026-02-20 09:45:05.697575412 +0000 UTC m=+0.083464762 container create a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, 
GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vendor=Red Hat, Inc., vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, name=rhceph) Feb 20 04:45:05 localhost systemd[1]: Started libpod-conmon-a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78.scope. Feb 20 04:45:05 localhost systemd[1]: Started libcrun container. Feb 20 04:45:05 localhost podman[296717]: 2026-02-20 09:45:05.665053662 +0000 UTC m=+0.050943072 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:45:05 localhost podman[296717]: 2026-02-20 09:45:05.774330593 +0000 UTC m=+0.160219933 container init a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1770267347, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, 
io.buildah.version=1.42.2, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, ceph=True) Feb 20 04:45:05 localhost podman[296717]: 2026-02-20 09:45:05.78942172 +0000 UTC m=+0.175311070 container start a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True) Feb 20 04:45:05 localhost podman[296717]: 2026-02-20 09:45:05.789776121 +0000 UTC m=+0.175665471 container attach a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat 
Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, architecture=x86_64, io.buildah.version=1.42.2, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.openshift.expose-services=, distribution-scope=public, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux ) Feb 20 04:45:05 localhost crazy_cerf[296732]: 167 167 Feb 20 04:45:05 localhost systemd[1]: libpod-a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78.scope: Deactivated successfully. 
Feb 20 04:45:05 localhost podman[296717]: 2026-02-20 09:45:05.792608731 +0000 UTC m=+0.178498081 container died a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:45:05 localhost sshd[296749]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:45:05 localhost sshd[296751]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:45:05 localhost podman[296737]: 2026-02-20 09:45:05.882226986 +0000 UTC m=+0.082206347 container remove a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, build-date=2026-02-09T10:25:24Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, release=1770267347, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:45:05 localhost systemd[1]: libpod-conmon-a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78.scope: Deactivated successfully. 
Feb 20 04:45:05 localhost nova_compute[281288]: 2026-02-20 09:45:05.906 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:05 localhost nova_compute[281288]: 2026-02-20 09:45:05.910 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:05 localhost nova_compute[281288]: 2026-02-20 09:45:05.910 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:05 localhost nova_compute[281288]: 2026-02-20 09:45:05.910 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:05 localhost nova_compute[281288]: 2026-02-20 09:45:05.950 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:05 localhost nova_compute[281288]: 2026-02-20 09:45:05.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:45:06.008 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:45:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:45:06.008 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 
20 04:45:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:45:06.011 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:45:06 localhost sshd[296801]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:45:06 localhost podman[296781]: 2026-02-20 09:45:06.333613885 +0000 UTC m=+0.089244816 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:45:06 localhost podman[296781]: 2026-02-20 09:45:06.344615396 +0000 UTC m=+0.100246347 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi 
, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:45:06 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:45:06 localhost ceph-mon[288586]: Reconfiguring osd.3 (monmap changed)... Feb 20 04:45:06 localhost ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain Feb 20 04:45:06 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:06 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:06 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:45:06 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:45:06 localhost systemd[1]: tmp-crun.NWRdsv.mount: Deactivated successfully. 
Feb 20 04:45:06 localhost systemd[1]: var-lib-containers-storage-overlay-c28de7092b6016f41a23dbffffb648be93dfe59aed1724101236e677c0d4345b-merged.mount: Deactivated successfully. Feb 20 04:45:06 localhost podman[296839]: Feb 20 04:45:06 localhost podman[296839]: 2026-02-20 09:45:06.833566528 +0000 UTC m=+0.075126877 container create d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, maintainer=Guillaume Abrioux ) Feb 20 04:45:06 localhost systemd[1]: Started libpod-conmon-d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309.scope. Feb 20 04:45:06 localhost systemd[1]: Started libcrun container. 
Feb 20 04:45:06 localhost podman[296839]: 2026-02-20 09:45:06.907317475 +0000 UTC m=+0.148877824 container init d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True) Feb 20 04:45:06 localhost podman[296839]: 2026-02-20 09:45:06.808789887 +0000 UTC m=+0.050350276 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:45:06 localhost podman[296839]: 2026-02-20 09:45:06.918904112 +0000 UTC m=+0.160464471 container start d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, 
maintainer=Guillaume Abrioux , version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:45:06 localhost podman[296839]: 2026-02-20 09:45:06.919155549 +0000 UTC m=+0.160715918 container attach d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a 
fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, version=7, ceph=True, io.openshift.tags=rhceph ceph) Feb 20 04:45:06 localhost interesting_diffie[296854]: 167 167 Feb 20 04:45:06 localhost systemd[1]: libpod-d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309.scope: Deactivated successfully. Feb 20 04:45:06 localhost podman[296839]: 2026-02-20 09:45:06.924430899 +0000 UTC m=+0.165991308 container died d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , version=7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1770267347, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.42.2) Feb 20 04:45:07 localhost 
podman[296859]: 2026-02-20 09:45:07.056467114 +0000 UTC m=+0.123397302 container remove d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., distribution-scope=public, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:45:07 localhost systemd[1]: libpod-conmon-d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309.scope: Deactivated successfully. Feb 20 04:45:07 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)... 
Feb 20 04:45:07 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain Feb 20 04:45:07 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:07 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:07 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:45:07 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:45:07 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:07 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:07 localhost systemd[1]: var-lib-containers-storage-overlay-a4179095fe9a7ad288376c95fe3a3aabd501a0c0f28ed0d3d3a5a9a945c1eb6c-merged.mount: Deactivated successfully. 
Feb 20 04:45:07 localhost podman[296929]: Feb 20 04:45:07 localhost podman[296929]: 2026-02-20 09:45:07.819093847 +0000 UTC m=+0.069570499 container create 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, vcs-type=git, name=rhceph, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:45:07 localhost systemd[1]: Started libpod-conmon-13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879.scope. Feb 20 04:45:07 localhost systemd[1]: Started libcrun container. 
Feb 20 04:45:07 localhost podman[296929]: 2026-02-20 09:45:07.881722428 +0000 UTC m=+0.132199040 container init 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-type=git, build-date=2026-02-09T10:25:24Z, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=1770267347, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, distribution-scope=public) Feb 20 04:45:07 localhost podman[296929]: 2026-02-20 09:45:07.892441332 +0000 UTC m=+0.142917954 container start 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main) Feb 20 04:45:07 localhost podman[296929]: 2026-02-20 09:45:07.892594356 +0000 UTC m=+0.143070968 container attach 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, distribution-scope=public, version=7, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, ceph=True, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1770267347) Feb 20 04:45:07 localhost podman[296929]: 2026-02-20 09:45:07.793287777 +0000 UTC m=+0.043764459 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:45:07 localhost affectionate_johnson[296944]: 167 167 Feb 20 04:45:07 localhost systemd[1]: libpod-13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879.scope: Deactivated successfully. Feb 20 04:45:07 localhost podman[296949]: 2026-02-20 09:45:07.964312405 +0000 UTC m=+0.056836658 container died 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 
on RHEL 9) Feb 20 04:45:07 localhost podman[296949]: 2026-02-20 09:45:07.999690405 +0000 UTC m=+0.092214598 container remove 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, RELEASE=main, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, name=rhceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:45:08 localhost systemd[1]: libpod-conmon-13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879.scope: Deactivated successfully. Feb 20 04:45:08 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)... 
Feb 20 04:45:08 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain Feb 20 04:45:08 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:08 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:08 localhost systemd[1]: tmp-crun.ZfCamy.mount: Deactivated successfully. Feb 20 04:45:08 localhost systemd[1]: var-lib-containers-storage-overlay-f324a384ad3aaa47e636805019c1f363abeaa08ce5bdbfa356d3180ce18ba22f-merged.mount: Deactivated successfully. Feb 20 04:45:09 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:10 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:45:10 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:10 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:10 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:45:10 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:10 localhost nova_compute[281288]: 2026-02-20 09:45:10.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:10 localhost nova_compute[281288]: 2026-02-20 09:45:10.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:10 localhost nova_compute[281288]: 2026-02-20 09:45:10.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:10 localhost nova_compute[281288]: 2026-02-20 09:45:10.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:10 localhost nova_compute[281288]: 2026-02-20 09:45:10.988 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:10 localhost nova_compute[281288]: 2026-02-20 09:45:10.989 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:11 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:11 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:45:13 localhost podman[297050]: 2026-02-20 09:45:13.16520293 +0000 UTC m=+0.099515776 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:45:13 localhost podman[297050]: 2026-02-20 09:45:13.177170929 +0000 UTC m=+0.111483805 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:45:13 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:45:13 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:14 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:15 localhost podman[297074]: 2026-02-20 09:45:15.151801319 +0000 UTC m=+0.089879534 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=openstack_network_exporter, 
org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:45:15 localhost podman[297074]: 2026-02-20 09:45:15.196274687 +0000 UTC m=+0.134352932 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, version=9.7, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible) Feb 20 04:45:15 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:45:15 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:45:15 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:15 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:15 localhost nova_compute[281288]: 2026-02-20 09:45:15.990 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:15 localhost nova_compute[281288]: 2026-02-20 09:45:15.992 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:15 localhost nova_compute[281288]: 2026-02-20 09:45:15.992 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:15 localhost nova_compute[281288]: 2026-02-20 09:45:15.993 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:16 localhost nova_compute[281288]: 2026-02-20 09:45:16.029 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:16 localhost nova_compute[281288]: 2026-02-20 
09:45:16.030 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:16 localhost ceph-mon[288586]: Reconfig service osd.default_drive_group Feb 20 04:45:16 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 e89: 6 total, 6 up, 6 in Feb 20 04:45:16 localhost systemd[1]: session-68.scope: Deactivated successfully. Feb 20 04:45:16 localhost systemd[1]: session-68.scope: Consumed 18.697s CPU time. Feb 20 04:45:16 localhost systemd-logind[759]: Session 68 logged out. Waiting for processes to exit. Feb 20 04:45:16 localhost systemd-logind[759]: Removed session 68. Feb 20 04:45:16 localhost sshd[297095]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:45:17 localhost systemd-logind[759]: New session 69 of user ceph-admin. Feb 20 04:45:17 localhost systemd[1]: Started Session 69 of User ceph-admin. Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' Feb 20 04:45:17 localhost ceph-mon[288586]: from='client.? 172.18.0.200:0/2448153276' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 20 04:45:17 localhost ceph-mon[288586]: Activating manager daemon np0005625203.lonygy Feb 20 04:45:17 localhost ceph-mon[288586]: from='client.? 
172.18.0.200:0/2448153276' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 20 04:45:17 localhost ceph-mon[288586]: Manager daemon np0005625203.lonygy is now available Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"} : dispatch Feb 20 04:45:17 localhost ceph-mon[288586]: removing stray HostCache host record np0005625200.localdomain.devices.0 Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"} : dispatch Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"}]': finished Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"} : dispatch Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"} : dispatch Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"}]': finished Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625203.lonygy/mirror_snapshot_schedule"} : dispatch Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625203.lonygy/mirror_snapshot_schedule"} : dispatch 
Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625203.lonygy/trash_purge_schedule"} : dispatch Feb 20 04:45:17 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625203.lonygy/trash_purge_schedule"} : dispatch Feb 20 04:45:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:45:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:45:17 localhost systemd[1]: tmp-crun.ntW3B7.mount: Deactivated successfully. Feb 20 04:45:17 localhost podman[297118]: 2026-02-20 09:45:17.394886363 +0000 UTC m=+0.099872546 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 04:45:17 localhost systemd[1]: tmp-crun.9YT7Kd.mount: Deactivated successfully. Feb 20 04:45:17 localhost podman[297117]: 2026-02-20 09:45:17.432024364 +0000 UTC m=+0.140725663 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 04:45:17 localhost podman[297117]: 2026-02-20 09:45:17.437082996 +0000 UTC m=+0.145784315 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Feb 20 04:45:17 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:45:17 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:17 localhost podman[297118]: 2026-02-20 09:45:17.478592081 +0000 UTC m=+0.183578264 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2) Feb 20 04:45:17 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:45:17 localhost podman[241968]: time="2026-02-20T09:45:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:45:17 localhost podman[241968]: @ - - [20/Feb/2026:09:45:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:45:17 localhost podman[241968]: @ - - [20/Feb/2026:09:45:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18260 "" "Go-http-client/1.1" Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.207 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.208 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.write.latency in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.238 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.239 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '367fc62c-6498-45e7-afb6-ab87b48faf43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.208821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd854e26-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'a8db71b9a2c7b8c85e64334e074e205fd69d673cb5a1a2621c68e8e8593ad0fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.208821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd856532-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'adba15823ed78bc62f98a5c63d35da7222e5babbebe49867e6edd58042daeb6e'}]}, 'timestamp': '2026-02-20 09:45:18.240108', '_unique_id': 'eede449e3288439d8c22de8ae1937dcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.244 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '877cf184-b1df-4a7b-94f1-b4da196b9856', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.243490', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd85ff38-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '766cc0fc2bf26313c9a081f016dd903bba7d3b0e61ae893e8f1af7527d1dc825'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.243490', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd861284-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '7584f571f0678c5073e2245d53d4c987f2df830dd6bca80902aeb16ccd23c1d0'}]}, 'timestamp': '2026-02-20 09:45:18.244558', '_unique_id': '5d6b25a127d042ce966b7d55ce70f54a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.246 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.247 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f714f747-e9c9-43cf-8af9-52da11b86a68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.246810', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd867cce-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'da1800765ee8e32361ba6d48928a26b45eb26adb729232c7f356975c4606a07e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': 
'2026-02-20T09:45:18.246810', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd868d0e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'f50052b7e7b1ee4a2351d78711809bbbe87a10b8210e64ffc4fe2af4de3e4597'}]}, 'timestamp': '2026-02-20 09:45:18.247738', '_unique_id': 'c59516deecee4e2cbcbc1f3eb286fffc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.250 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.268 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bbba8e16-2b67-4de1-9e29-9e1d81295e7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:45:18.250202', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'dd89d194-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.507337117, 'message_signature': '5d3ac0d1646d56e6246a044f6124cd30f94341be8db9ccad3edb448315e06710'}]}, 'timestamp': '2026-02-20 09:45:18.269138', '_unique_id': '2ad34cd50c554ff89983eee30cb12979'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a531ba0c-39be-4d1e-a90d-d724949dd0e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.272076', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8b42d6-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 
'message_signature': '2f8f3e92190b89e4c69537318ce23ee564c808579a8136cb949b9ecc820a7f10'}]}, 'timestamp': '2026-02-20 09:45:18.278571', '_unique_id': 'ec560b0fc1654ba6a85e655679936fb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.280 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a946216a-e10a-4a98-ba0d-c5dc7feab83d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.280862', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8bb248-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': 'c631bf36fec03d7e485d40cdfdd6d47ef81120b8b3fd38b450e0f460f4a39460'}]}, 'timestamp': '2026-02-20 09:45:18.281489', '_unique_id': 'bb6af1931abe4813a95d5c3063d15573'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.283 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd209d69e-9114-424a-af25-b206f06cd3fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.283947', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8c27c8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': 'c1bd8821ac47dc7a315d5263e72bd74c627eb3f756ba0220b4a2a8b298501021'}]}, 'timestamp': '2026-02-20 09:45:18.284419', '_unique_id': '61546c3517ae4e2e9d445807c9ce916b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:45:18.285 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.286 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.286 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '47232825-9ee3-41c5-b62c-87e4c8a3c5b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.286575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd8c8fba-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '5a384b0eba87ea4ebd4e4208a83bba05198965125897b0305ceabd925b120c04'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.286575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd8ca004-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'dd3e2889dac1bc0acf7e55a58c81a6cc721d8c991b91052e9f2802973b080a03'}]}, 'timestamp': '2026-02-20 09:45:18.287463', '_unique_id': '4f29714967894f1a913d9e4907f33d03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85f05462-ec54-45ee-a335-ce207bf07852', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.289959', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8d1250-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': 'f08c5f611addf5412580c4d3b5aa2d92d549071ba0237dbbb4de07d048f1c1f2'}]}, 'timestamp': '2026-02-20 09:45:18.290417', '_unique_id': 'bbccbad1ec9445d1a49db069a424dde3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af5ccc52-2891-4a52-b387-3391180a0bdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.292559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd8eedbe-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '98a6d5d8c244c564537f259518bba9f5a1f85538a019b3f18afd49ac92bfa07b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.292559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd8effac-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '6fe982bce55c8dcc990deaa1ab71a49fe3066fd6b45c12bcd4b4d595dcab7269'}]}, 'timestamp': '2026-02-20 09:45:18.303020', '_unique_id': '33f481cb1c494fbdb38bd8605145112a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.305 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.305 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '971062ab-6e5b-4128-9c15-6ec56f10ae3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.305406', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8f6f96-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '8408b71cf6e14d29a0579e1803832c7275e75e815c93a20a140a448fd48365d0'}]}, 'timestamp': '2026-02-20 09:45:18.305922', '_unique_id': '1536b7e96eaf4342bc742bdb98ac0242'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.308 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8f2469ec-08c4-44e2-a127-2a90a88b6262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.308172', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8fda26-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '46666ff1bf222c867dcf556f4abebb9a2c64f556855d8dfd0d2ae6dc4fbe6697'}]}, 'timestamp': '2026-02-20 09:45:18.308680', '_unique_id': '9eb1536c9ad344f895143db909a522d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 14610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ad704ad-1afe-4c3c-b4d1-bd18632bcca2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14610000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:45:18.311144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': 
'43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'dd905050-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.507337117, 'message_signature': '040282d18e07eb00485c4f3f32850a2b306ef76718a49b0d0e8ec5b214fe945d'}]}, 'timestamp': '2026-02-20 09:45:18.311782', '_unique_id': 'c82f0e93b1b243529ef77b0f3b6da6a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ec6e6830-38e8-4580-ba05-0f27f3c0b949', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.314080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd90c918-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': 'f639e7d6ce34d62c5280bbce8a1415619665da19dd5b8f206c40c0c0cd8a2190'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.314080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd90dc82-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '58ff575fa5897a37a063c91a2bf7d6c933cc904c7ad755f7911405291bc00bd0'}]}, 'timestamp': '2026-02-20 09:45:18.315228', '_unique_id': '63a2bcad244a4b5a8a67bd2646bc5c6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:45:18.316 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:45:18.316 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 
ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa7f4b46-6f1d-462a-87bc-11c21c251517', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.317559', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd914a50-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '039002ba55de6a585ccad887c9c63c85a23cbffd8c7ef456ea6d74051a5ffef8'}]}, 'timestamp': '2026-02-20 09:45:18.318065', '_unique_id': '92a801b93531412285d7eb6ba7de6c29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR 
oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6897e52-ff21-4892-b271-8f82f4a41c04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.320195', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd91af5e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '9317671077ef1ac99fcbb069349a4ab7d41222a6d0b798002731f8e8dad05c29'}]}, 'timestamp': '2026-02-20 09:45:18.320696', '_unique_id': 'dbc4ade550e049268db9e4f5e3f39faa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '16be9827-dddc-46c0-abe0-5bfa72d9fdb4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.322924', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd9219f8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': 'f081ef4b7dec0cb0a2b30aaabe39b3a85d8fff7152114317b15e70e507a831d4'}]}, 'timestamp': '2026-02-20 09:45:18.323380', '_unique_id': '01e703f9e23c44fc93ebdff522e1861a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b691dc05-73d3-4f7f-bf5f-0e5aa92e3617', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.325663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd928564-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'a439bcbf775aab2478f8a367fc2fe72dba63dd255552e9e206c5bdfc53eff876'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.325663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd929590-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '75d28e24803291058f53a9ace8488fbaaf92c72c2ac8a08e0e49d844e4b7c6c3'}]}, 'timestamp': '2026-02-20 09:45:18.326518', '_unique_id': 'b1bfe8c553744d4d8e7104e4c55bdd04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.328 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost podman[297247]: 2026-02-20 09:45:18.329287296 +0000 UTC m=+0.128295971 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, distribution-scope=public, ceph=True, architecture=x86_64, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1770267347, CEPH_POINT_RELEASE=) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26959704-3b5a-4ca8-a915-f0c3e1599bf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.328556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd92f3be-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': 
'2eb9d1a7432b4abac7245da59afb90b0e0c72c41ce2ee8ff395ccc60e033bf70'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.328556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd92fd6e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '9dba557492af9f0267f31df94764b703e9f88f0382b5ed02da695bb450487a21'}]}, 'timestamp': '2026-02-20 09:45:18.329097', '_unique_id': 'da8ccdfb5ff64f97b04c7151b0cc57d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:45:18.329 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.330 12 
DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3108811e-4e75-4327-b4aa-4d93cc8f7a32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.330389', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 
'message_id': 'dd933a2c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '6bb3a3280ba5cd0f3fd3e97a42b3298204dff36b487cefeaaf65a30a7ff11a05'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.330389', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd9344cc-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'f6921ebb47d94ede52369ee953f41062723a46b034311e04350df62e853ac3ee'}]}, 'timestamp': '2026-02-20 09:45:18.330926', '_unique_id': 'ded9b9d8de2e4147b885f551fbc92f96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:45:18.331 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging return 
rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.outgoing.packets.error in the context of pollsters Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5326f92-3236-405d-b57b-d9d65f551b18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.332232', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 
'dd93825c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '1eddd0a9ed2ad04b5393972659187f7c3bb2bfe4677cd72ef611bc90a3bd8680'}]}, 'timestamp': '2026-02-20 09:45:18.332516', '_unique_id': 'be9e124a95d94ffdaafa39610df62120'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:45:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:45:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:45:18 localhost podman[297247]: 2026-02-20 09:45:18.433523704 +0000 UTC m=+0.232532389 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph)
Feb 20 04:45:19 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:19 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:19 localhost ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Bus STARTING
Feb 20 04:45:19 localhost ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Serving on http://172.18.0.107:8765
Feb 20 04:45:19 localhost ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Serving on https://172.18.0.107:7150
Feb 20 04:45:19 localhost ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Bus STARTED
Feb 20 04:45:19 localhost ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Client ('172.18.0.107', 58052) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 04:45:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:20 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:45:21 localhost nova_compute[281288]: 2026-02-20 09:45:21.030 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:45:21 localhost nova_compute[281288]: 2026-02-20 09:45:21.033 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:45:21 localhost nova_compute[281288]: 2026-02-20 09:45:21.033 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:45:21 localhost nova_compute[281288]: 2026-02-20 09:45:21.034 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:45:21 localhost nova_compute[281288]: 2026-02-20 09:45:21.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:45:21 localhost nova_compute[281288]: 2026-02-20 09:45:21.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:45:21 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 04:45:21 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 04:45:21 localhost ceph-mon[288586]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:45:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 04:45:21 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 04:45:21 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 04:45:21 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 04:45:21 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 04:45:21 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:21 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:22 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:45:22 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:45:22 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:45:22 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:45:23 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 04:45:23 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 04:45:23 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 04:45:23 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 04:45:23 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 04:45:23 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 04:45:23 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 04:45:23 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:24 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:24 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 04:45:25 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:45:25 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:25 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:26 localhost nova_compute[281288]: 2026-02-20 09:45:26.067 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:45:26 localhost nova_compute[281288]: 2026-02-20 09:45:26.068 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:45:26 localhost nova_compute[281288]: 2026-02-20 09:45:26.068 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:45:26 localhost nova_compute[281288]: 2026-02-20 09:45:26.069 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:45:26 localhost nova_compute[281288]: 2026-02-20 09:45:26.101 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:45:26 localhost nova_compute[281288]: 2026-02-20 09:45:26.101 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:45:26 localhost ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 04:45:26 localhost openstack_network_exporter[244414]: ERROR 09:45:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 04:45:26 localhost openstack_network_exporter[244414]:
Feb 20 04:45:26 localhost openstack_network_exporter[244414]: ERROR 09:45:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 04:45:26 localhost openstack_network_exporter[244414]:
Feb 20 04:45:27 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:27 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:27 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:27 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:27 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 04:45:27 localhost ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 04:45:27 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:27 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:27 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:28 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:28 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:28 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:28 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:28 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 04:45:28 localhost ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 04:45:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:45:29 localhost podman[298146]: 2026-02-20 09:45:29.157776715 +0000 UTC m=+0.087470845 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 20 04:45:29 localhost podman[298146]: 2026-02-20 09:45:29.19718996 +0000 UTC m=+0.126884060 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 20 04:45:29 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 04:45:29 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:29 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:29 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:29 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:45:29 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 04:45:29 localhost ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 04:45:29 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:29 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 04:45:30 localhost podman[298217]:
Feb 20 04:45:30 localhost podman[298217]: 2026-02-20 09:45:30.089468122 +0000 UTC m=+0.074519190 container create 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.42.2)
Feb 20 04:45:30 localhost systemd[1]: Started libpod-conmon-99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5.scope.
Feb 20 04:45:30 localhost systemd[1]: Started libcrun container.
Feb 20 04:45:30 localhost podman[298217]: 2026-02-20 09:45:30.059875124 +0000 UTC m=+0.044926222 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 04:45:30 localhost podman[298217]: 2026-02-20 09:45:30.161591042 +0000 UTC m=+0.146642110 container init 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, distribution-scope=public, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, io.openshift.expose-services=, release=1770267347, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 04:45:30 localhost podman[298217]: 2026-02-20 09:45:30.17281067 +0000 UTC m=+0.157861728 container start 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, io.buildah.version=1.42.2, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, RELEASE=main, CEPH_POINT_RELEASE=)
Feb 20 04:45:30 localhost podman[298217]: 2026-02-20 09:45:30.173142909 +0000 UTC m=+0.158193977 container attach 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, maintainer=Guillaume Abrioux , RELEASE=main, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph)
Feb 20 04:45:30 localhost strange_kowalevski[298233]: 167 167
Feb 20 04:45:30 localhost systemd[1]: libpod-99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5.scope: Deactivated successfully.
Feb 20 04:45:30 localhost podman[298217]: 2026-02-20 09:45:30.176474423 +0000 UTC m=+0.161525521 container died 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1770267347, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.42.2, name=rhceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 04:45:30 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:45:30 localhost systemd[1]: var-lib-containers-storage-overlay-4e26cf461a5b57e070504f6098c3d69cb97fbde0e08a3c5b6e0107a6ec968115-merged.mount: Deactivated successfully.
Feb 20 04:45:30 localhost podman[298238]: 2026-02-20 09:45:30.283516541 +0000 UTC m=+0.095295337 container remove 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, architecture=x86_64, release=1770267347, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, ceph=True, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:45:30 localhost systemd[1]: libpod-conmon-99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5.scope: Deactivated successfully. 
Feb 20 04:45:30 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:30 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:30 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:30 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:30 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 20 04:45:30 localhost ceph-mon[288586]: Reconfiguring daemon osd.0 on np0005625204.localdomain Feb 20 04:45:31 localhost nova_compute[281288]: 2026-02-20 09:45:31.102 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:31 localhost nova_compute[281288]: 2026-02-20 09:45:31.105 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:31 localhost nova_compute[281288]: 2026-02-20 09:45:31.105 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:31 localhost nova_compute[281288]: 2026-02-20 09:45:31.105 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:31 localhost nova_compute[281288]: 2026-02-20 09:45:31.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:31 localhost nova_compute[281288]: 2026-02-20 09:45:31.134 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 
04:45:31 localhost podman[298313]: Feb 20 04:45:31 localhost podman[298313]: 2026-02-20 09:45:31.214797755 +0000 UTC m=+0.062801587 container create 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:45:31 localhost systemd[1]: Started libpod-conmon-3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5.scope. Feb 20 04:45:31 localhost systemd[1]: Started libcrun container. 
Feb 20 04:45:31 localhost podman[298313]: 2026-02-20 09:45:31.269177744 +0000 UTC m=+0.117181576 container init 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, ceph=True, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:45:31 localhost podman[298313]: 2026-02-20 09:45:31.277597352 +0000 UTC m=+0.125601184 container start 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=1770267347, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Feb 20 04:45:31 localhost podman[298313]: 2026-02-20 09:45:31.277832139 +0000 UTC m=+0.125835981 container attach 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1770267347, io.buildah.version=1.42.2, io.openshift.expose-services=, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7) Feb 20 04:45:31 localhost musing_murdock[298328]: 167 167 Feb 20 04:45:31 localhost systemd[1]: libpod-3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5.scope: Deactivated successfully. Feb 20 04:45:31 localhost podman[298313]: 2026-02-20 09:45:31.282057398 +0000 UTC m=+0.130061310 container died 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, vendor=Red Hat, Inc., release=1770267347, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:45:31 localhost podman[298313]: 2026-02-20 09:45:31.193996217 +0000 UTC m=+0.042000039 image pull 
registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:45:31 localhost podman[298333]: 2026-02-20 09:45:31.373015621 +0000 UTC m=+0.081848416 container remove 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph) Feb 20 04:45:31 localhost systemd[1]: libpod-conmon-3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5.scope: Deactivated successfully. 
Feb 20 04:45:31 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:31 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:31 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:31 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:31 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 20 04:45:31 localhost ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain Feb 20 04:45:31 localhost ceph-mon[288586]: Saving service mon spec with placement label:mon Feb 20 04:45:31 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:31 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:31 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:32 localhost podman[298409]: Feb 20 04:45:32 localhost podman[298409]: 2026-02-20 09:45:32.187839482 +0000 UTC m=+0.075569999 container create aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, release=1770267347, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:45:32 localhost systemd[1]: Started libpod-conmon-aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e.scope. Feb 20 04:45:32 localhost systemd[1]: var-lib-containers-storage-overlay-3e84cb3051dbbf5b8837dc91ec5eea1c25d8cb305defe352c3a2a0e0e26fe4c6-merged.mount: Deactivated successfully. Feb 20 04:45:32 localhost systemd[1]: Started libcrun container. Feb 20 04:45:32 localhost podman[298409]: 2026-02-20 09:45:32.157680889 +0000 UTC m=+0.045411396 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:45:32 localhost podman[298409]: 2026-02-20 09:45:32.257453412 +0000 UTC m=+0.145183919 container init aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, 
com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:45:32 localhost podman[298409]: 2026-02-20 09:45:32.266776275 +0000 UTC m=+0.154506802 container start aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, version=7, architecture=x86_64) Feb 20 04:45:32 localhost podman[298409]: 2026-02-20 
09:45:32.267099584 +0000 UTC m=+0.154830161 container attach aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, com.redhat.component=rhceph-container, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=1770267347, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, build-date=2026-02-09T10:25:24Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:45:32 localhost recursing_ramanujan[298424]: 167 167 Feb 20 04:45:32 localhost systemd[1]: libpod-aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e.scope: Deactivated successfully. 
Feb 20 04:45:32 localhost podman[298409]: 2026-02-20 09:45:32.270709196 +0000 UTC m=+0.158439733 container died aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container) Feb 20 04:45:32 localhost podman[298429]: 2026-02-20 09:45:32.3729882 +0000 UTC m=+0.090836211 container remove aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, ceph=True) Feb 20 04:45:32 localhost systemd[1]: libpod-conmon-aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e.scope: Deactivated successfully. Feb 20 04:45:32 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:32 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:32 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:32 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:32 localhost ceph-mon[288586]: Reconfiguring mon.np0005625204 (monmap changed)... 
Feb 20 04:45:32 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:45:32 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain Feb 20 04:45:32 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:32 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:32 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.537510) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732537567, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2849, "num_deletes": 256, "total_data_size": 8905305, "memory_usage": 9185624, "flush_reason": "Manual Compaction"} Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732559009, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 5327455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17307, "largest_seqno": 
20151, "table_properties": {"data_size": 5315499, "index_size": 7501, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 30618, "raw_average_key_size": 22, "raw_value_size": 5289399, "raw_average_value_size": 3920, "num_data_blocks": 326, "num_entries": 1349, "num_filter_entries": 1349, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580670, "oldest_key_time": 1771580670, "file_creation_time": 1771580732, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 21557 microseconds, and 11212 cpu microseconds. Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.559064) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 5327455 bytes OK Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.559090) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.561386) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.561408) EVENT_LOG_v1 {"time_micros": 1771580732561401, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.561434) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8891394, prev total WAL file size 8891686, number of live WAL files 2. Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.563555) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. 
'7061786F73003130373934' seq:0, type:0; will stop at (end) Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(5202KB)], [24(14MB)] Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732563631, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 20516503, "oldest_snapshot_seqno": -1} Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 10934 keys, 18161002 bytes, temperature: kUnknown Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732643726, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 18161002, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18098403, "index_size": 34146, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27397, "raw_key_size": 291577, "raw_average_key_size": 26, "raw_value_size": 17911752, "raw_average_value_size": 1638, "num_data_blocks": 1308, "num_entries": 10934, "num_filter_entries": 10934, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580732, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.644299) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 18161002 bytes Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.646133) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 256.0 rd, 226.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.1, 14.5 +0.0 blob) out(17.3 +0.0 blob), read-write-amplify(7.3) write-amplify(3.4) OK, records in: 11483, records dropped: 549 output_compression: NoCompression Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.646164) EVENT_LOG_v1 {"time_micros": 1771580732646151, "job": 12, "event": "compaction_finished", "compaction_time_micros": 80140, "compaction_time_cpu_micros": 46405, "output_level": 6, "num_output_files": 1, "total_output_size": 18161002, "num_input_records": 11483, "num_output_records": 10934, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005625204/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732647338, "job": 12, "event": "table_file_deletion", "file_number": 26} Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732650083, "job": 12, "event": "table_file_deletion", "file_number": 24} Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.563434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650195) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:45:32 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650208) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:45:33 localhost systemd[1]: var-lib-containers-storage-overlay-089952a015ce864469b7c1e49d7055a1e46e3372bdbe6f559830c6fae9a5ec3f-merged.mount: Deactivated successfully. 
Feb 20 04:45:33 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:33 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:33 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:45:33 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:34 localhost ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)... Feb 20 04:45:34 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain Feb 20 04:45:34 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:34 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:34 localhost ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)... 
Feb 20 04:45:34 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:45:34 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:45:35 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:45:35 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:35 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:35 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:35 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:36 localhost nova_compute[281288]: 2026-02-20 09:45:36.135 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:36 localhost nova_compute[281288]: 2026-02-20 09:45:36.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:36 localhost nova_compute[281288]: 2026-02-20 09:45:36.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:36 localhost nova_compute[281288]: 2026-02-20 09:45:36.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:36 localhost nova_compute[281288]: 2026-02-20 09:45:36.187 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:36 localhost nova_compute[281288]: 2026-02-20 09:45:36.188 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:45:37 localhost podman[298463]: 2026-02-20 09:45:37.166282447 +0000 UTC m=+0.099953758 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:45:37 localhost podman[298463]: 2026-02-20 09:45:37.176887688 +0000 UTC m=+0.110558979 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 
'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:45:37 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:45:37 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 20 04:45:37 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:39 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44580 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Feb 20 04:45:39 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:45:39 localhost ceph-mon[288586]: paxos.1).electionLogic(46) init, last seen epoch 46 Feb 20 04:45:39 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:45:41 localhost nova_compute[281288]: 2026-02-20 09:45:41.189 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:41 localhost nova_compute[281288]: 2026-02-20 09:45:41.190 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:42 localhost nova_compute[281288]: 2026-02-20 
09:45:42.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:42 localhost nova_compute[281288]: 2026-02-20 09:45:42.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 20 04:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:45:44 localhost systemd[292696]: Starting Mark boot as successful... Feb 20 04:45:44 localhost podman[298486]: 2026-02-20 09:45:44.148626169 +0000 UTC m=+0.090167071 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:45:44 localhost systemd[292696]: Finished Mark boot as successful. Feb 20 04:45:44 localhost podman[298486]: 2026-02-20 09:45:44.162982706 +0000 UTC m=+0.104523568 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:45:44 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:45:44 localhost ceph-mon[288586]: mon.np0005625204@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:45:44 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:45:44 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:45:44 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:45:44 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:45:44 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:45:44 localhost ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2,3) Feb 20 04:45:44 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:45:45 localhost ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:45:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:45:46 localhost podman[298513]: 2026-02-20 09:45:46.132515752 +0000 UTC m=+0.069791185 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, managed_by=edpm_ansible, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 
'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7) Feb 20 04:45:46 localhost podman[298513]: 2026-02-20 09:45:46.146073765 +0000 UTC m=+0.083349228 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, io.openshift.tags=minimal rhel9, release=1770267347) Feb 20 04:45:46 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.191 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.192 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.193 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.193 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.247 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.248 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:46 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44420 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Feb 20 04:45:46 localhost ceph-mon[288586]: mon.np0005625204@1(peon) e13 my rank is now 0 (was 1) Feb 20 04:45:46 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Feb 20 04:45:46 localhost 
ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Feb 20 04:45:46 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44000 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Feb 20 04:45:46 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:45:46 localhost ceph-mon[288586]: paxos.0).electionLogic(50) init, last seen epoch 50 Feb 20 04:45:46 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.733 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.761 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.761 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.762 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.762 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:45:46 localhost nova_compute[281288]: 2026-02-20 09:45:46.762 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:45:47 localhost podman[241968]: time="2026-02-20T09:45:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:45:47 localhost podman[241968]: @ - - [20/Feb/2026:09:45:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:45:47 localhost podman[241968]: @ - - [20/Feb/2026:09:45:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18269 "" "Go-http-client/1.1" Feb 20 04:45:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:45:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:45:48 localhost podman[298546]: 2026-02-20 09:45:48.128139126 +0000 UTC m=+0.067463420 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:45:48 localhost 
podman[298546]: 2026-02-20 09:45:48.160982745 +0000 UTC m=+0.100307029 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 20 04:45:48 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:45:48 localhost podman[298545]: 2026-02-20 09:45:48.114771337 +0000 UTC m=+0.055573093 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:45:48 localhost podman[298545]: 2026-02-20 09:45:48.246501694 +0000 UTC m=+0.187303480 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, 
org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:45:48 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:45:49 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e13 handle_auth_request failed to assign global_id Feb 20 04:45:49 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e13 handle_auth_request failed to assign global_id Feb 20 04:45:50 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e13 handle_auth_request failed to assign global_id Feb 20 04:45:50 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e13 handle_auth_request failed to assign global_id Feb 20 04:45:51 localhost nova_compute[281288]: 2026-02-20 09:45:51.250 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:51 localhost nova_compute[281288]: 2026-02-20 09:45:51.252 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:51 localhost nova_compute[281288]: 2026-02-20 09:45:51.253 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:51 localhost nova_compute[281288]: 2026-02-20 09:45:51.253 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:51 localhost nova_compute[281288]: 2026-02-20 09:45:51.277 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:51 localhost nova_compute[281288]: 2026-02-20 09:45:51.278 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:51 localhost sshd[298588]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : 
mon.np0005625204 is new leader, mons np0005625204,np0005625202 in quorum (ranks 0,1) Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : monmap epoch 13 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : last_changed 2026-02-20T09:45:46.327222+0000 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : created 2026-02-20T07:36:51.191305+0000 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : election_strategy: 1 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203 Feb 20 04:45:51 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : mgrmap e36: np0005625203.lonygy(active, since 34s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005625204,np0005625202 (MON_DOWN) Feb 20 04:45:51 localhost ceph-mon[288586]: 
log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1/3 mons down, quorum np0005625204,np0005625202 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005625204,np0005625202 Feb 20 04:45:51 localhost ceph-mon[288586]: log_channel(cluster) log [WRN] : mon.np0005625203 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum) Feb 20 04:45:51 localhost ceph-mon[288586]: Remove daemons mon.np0005625201 Feb 20 04:45:51 localhost ceph-mon[288586]: Safe to remove mon.np0005625201: new quorum should be ['np0005625204', 'np0005625202', 'np0005625203'] (from ['np0005625204', 'np0005625202', 'np0005625203']) Feb 20 04:45:51 localhost ceph-mon[288586]: Removing monitor np0005625201 from monmap... Feb 20 04:45:51 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625201"} : dispatch Feb 20 04:45:51 localhost ceph-mon[288586]: Removing daemon mon.np0005625201 from np0005625201.localdomain -- ports [] Feb 20 04:45:51 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:45:51 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:45:51 localhost ceph-mon[288586]: mon.np0005625204 is new leader, mons np0005625204,np0005625202 in quorum (ranks 0,1) Feb 20 04:45:51 localhost ceph-mon[288586]: Health check failed: 1/3 mons down, quorum np0005625204,np0005625202 (MON_DOWN) Feb 20 04:45:51 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:45:51 localhost ceph-mon[288586]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005625204,np0005625202 Feb 20 04:45:51 localhost ceph-mon[288586]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005625204,np0005625202 Feb 20 04:45:51 localhost ceph-mon[288586]: mon.np0005625203 (rank 2) 
addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum) Feb 20 04:45:52 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:45:52 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1126955229' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:45:53 localhost ceph-mon[288586]: paxos.0).electionLogic(53) init, last seen epoch 53, mid-election, bumping Feb 20 04:45:53 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2) Feb 20 04:45:53 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:45:53 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:45:53 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:45:53 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : monmap epoch 13 Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : last_changed 2026-02-20T09:45:46.327222+0000 Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : created 2026-02-20T07:36:51.191305+0000 Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 20 04:45:53 localhost ceph-mon[288586]: 
log_channel(cluster) log [DBG] : election_strategy: 1 Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204 Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202 Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203 Feb 20 04:45:53 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : mgrmap e36: np0005625203.lonygy(active, since 37s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005625204,np0005625202) Feb 20 04:45:53 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : Cluster is now healthy Feb 20 04:45:53 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:45:53 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0) Feb 20 04:45:53 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:45:53 localhost ceph-mon[288586]: 
log_channel(cluster) log [INF] : overall HEALTH_OK Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0) Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost 
ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:45:54 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3402059240' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.335 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 7.573s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.401 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.402 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625203 calling 
monitor election Feb 20 04:45:54 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:45:54 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:45:54 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:45:54 localhost ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:45:54 localhost ceph-mon[288586]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2) Feb 20 04:45:54 localhost ceph-mon[288586]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005625204,np0005625202) Feb 20 04:45:54 localhost ceph-mon[288586]: Cluster is now healthy Feb 20 04:45:54 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' 
entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:45:54 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.610 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.612 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11742MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.612 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.613 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.715 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': 
{'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.715 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.716 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.781 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.856 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.856 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.879 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.899 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: 
HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 04:45:54 localhost nova_compute[281288]: 2026-02-20 09:45:54.943 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:45:55 localhost ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:45:55 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:45:55 localhost ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1060203723' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.402 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.407 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.426 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.427 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - 
- - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.428 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.428 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.428 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.445 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 20 04:45:55 localhost nova_compute[281288]: 2026-02-20 09:45:55.446 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:55 localhost ceph-mon[288586]: Deploying daemon mon.np0005625201 on np0005625201.localdomain Feb 20 04:45:55 localhost ceph-mon[288586]: Removed label mon from host 
np0005625201.localdomain Feb 20 04:45:55 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 20 04:45:55 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:56 localhost nova_compute[281288]: 2026-02-20 09:45:56.279 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:56 localhost nova_compute[281288]: 2026-02-20 09:45:56.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:45:56 localhost nova_compute[281288]: 2026-02-20 09:45:56.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:45:56 localhost nova_compute[281288]: 2026-02-20 09:45:56.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:56 localhost nova_compute[281288]: 2026-02-20 09:45:56.314 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:45:56 localhost nova_compute[281288]: 2026-02-20 09:45:56.315 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:45:56 localhost openstack_network_exporter[244414]: ERROR 09:45:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:45:56 localhost openstack_network_exporter[244414]: Feb 20 04:45:56 localhost openstack_network_exporter[244414]: ERROR 09:45:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please 
specify an existing datapath Feb 20 04:45:56 localhost openstack_network_exporter[244414]: Feb 20 04:45:56 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:56 localhost ceph-mon[288586]: Removed label mgr from host np0005625201.localdomain Feb 20 04:45:56 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 20 04:45:56 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0) Feb 20 04:45:57 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0) Feb 20 04:45:57 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 20 04:45:57 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 20 04:45:57 localhost ceph-mon[288586]: 
log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:45:57 localhost nova_compute[281288]: 2026-02-20 09:45:57.448 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:57 localhost nova_compute[281288]: 2026-02-20 09:45:57.450 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:57 localhost nova_compute[281288]: 2026-02-20 09:45:57.471 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:57 localhost nova_compute[281288]: 2026-02-20 09:45:57.472 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:45:57 localhost nova_compute[281288]: 2026-02-20 09:45:57.472 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(leader).monmap v13 adding/updating np0005625201 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor 
cluster Feb 20 04:45:57 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44160 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Feb 20 04:45:57 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:45:57 localhost ceph-mon[288586]: paxos.0).electionLogic(56) init, last seen epoch 56 Feb 20 04:45:57 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.106 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.106 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.106 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.107 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.467 281292 DEBUG nova.network.neutron [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.482 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.482 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.483 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.483 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.484 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.484 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.485 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.485 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:45:58 localhost nova_compute[281288]: 2026-02-20 09:45:58.485 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:46:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:46:00 localhost systemd[1]: tmp-crun.0BZQmR.mount: Deactivated successfully. Feb 20 04:46:00 localhost podman[298961]: 2026-02-20 09:46:00.143589359 +0000 UTC m=+0.078807271 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 20 04:46:00 localhost podman[298961]: 2026-02-20 09:46:00.177519028 +0000 UTC m=+0.112736960 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 
'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 20 04:46:00 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:46:01 localhost nova_compute[281288]: 2026-02-20 09:46:01.316 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:01 localhost nova_compute[281288]: 2026-02-20 09:46:01.318 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:01 localhost nova_compute[281288]: 2026-02-20 09:46:01.318 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:46:01 localhost nova_compute[281288]: 2026-02-20 09:46:01.318 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:01 localhost nova_compute[281288]: 2026-02-20 09:46:01.351 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:01 localhost nova_compute[281288]: 2026-02-20 09:46:01.352 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:02 localhost ceph-mon[288586]: paxos.0).electionLogic(57) init, last seen epoch 57, mid-election, bumping Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203,np0005625201 in quorum (ranks 0,1,2,3) Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : monmap epoch 14 Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : last_changed 2026-02-20T09:45:57.556107+0000 Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : created 2026-02-20T07:36:51.191305+0000 Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : election_strategy: 1 Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204 Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202 Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203 Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201 Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 collect_metadata vda: no unique device id for vda: fallback method 
has no model nor serial Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : mgrmap e36: np0005625203.lonygy(active, since 46s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0) Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : overall HEALTH_OK Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0) Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:02 localhost ceph-mon[288586]: Removed label _admin from host np0005625201.localdomain Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625201 calling monitor election Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625204 is new 
leader, mons np0005625204,np0005625202,np0005625203,np0005625201 in quorum (ranks 0,1,2,3) Feb 20 04:46:02 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:46:02 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:02 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:02 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0) Feb 20 04:46:02 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:03 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0) Feb 20 04:46:03 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:03 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:03 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:46:03 localhost ceph-mon[288586]: Removing np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:03 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:03 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:03 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:03 localhost ceph-mon[288586]: Removing np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:46:03 localhost ceph-mon[288586]: Removing np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:46:03 localhost 
ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:03 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:04 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:46:04 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:46:04 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:04 localhost sshd[299231]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:46:04 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:04 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:46:05 localhost ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:46:05 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:46:05 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:05 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:05 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:05 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 
04:46:05 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:05 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:05 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:05 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:46:05 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:05 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:46:05 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:05 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 20 04:46:05 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:05 localhost nova_compute[281288]: 2026-02-20 09:46:05.743 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:05 localhost nova_compute[281288]: 2026-02-20 09:46:05.765 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 20 04:46:05 localhost nova_compute[281288]: 2026-02-20 09:46:05.766 281292 DEBUG oslo_concurrency.lockutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:46:05 localhost nova_compute[281288]: 2026-02-20 09:46:05.767 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:46:05 localhost nova_compute[281288]: 2026-02-20 09:46:05.789 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:46:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:46:06.009 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:46:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:46:06.009 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:46:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:46:06.010 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:46:06 localhost nova_compute[281288]: 2026-02-20 09:46:06.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:06 localhost nova_compute[281288]: 2026-02-20 09:46:06.355 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:06 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:06 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:06 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:06 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:06 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:06 localhost ceph-mon[288586]: Removing daemon mgr.np0005625201.mtnyvu from np0005625201.localdomain -- ports [8765] Feb 20 04:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:46:08 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command({"prefix": "auth rm", "entity": "mgr.np0005625201.mtnyvu"} v 0) Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth rm", "entity": "mgr.np0005625201.mtnyvu"} : dispatch Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005625201.mtnyvu"}]': finished Feb 20 04:46:08 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:08 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:08 localhost systemd[1]: tmp-crun.tgU8Om.mount: Deactivated successfully. 
Feb 20 04:46:08 localhost podman[299303]: 2026-02-20 09:46:08.170613232 +0000 UTC m=+0.097814587 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:46:08 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command({"prefix": "mon rm", "name": "np0005625201"} v 0) Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625201"} : dispatch Feb 20 04:46:08 localhost podman[299303]: 2026-02-20 09:46:08.17794928 +0000 UTC m=+0.105150625 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:46:08 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd600 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Feb 20 04:46:08 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election Feb 20 04:46:08 localhost ceph-mon[288586]: paxos.0).electionLogic(60) init, last seen epoch 60 Feb 20 04:46:08 localhost ceph-mon[288586]: mon.np0005625204@0(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2) Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : monmap epoch 15 Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : last_changed 2026-02-20T09:46:08.177805+0000 Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : created 2026-02-20T07:36:51.191305+0000 Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 
election_strategy: 1 Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204 Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202 Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203 Feb 20 04:46:08 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [DBG] : mgrmap e36: np0005625203.lonygy(active, since 51s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(cluster) log [INF] : overall HEALTH_OK Feb 20 04:46:08 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:08 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 20 04:46:08 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:09 localhost ceph-mon[288586]: Removing key for mgr.np0005625201.mtnyvu Feb 20 04:46:09 localhost ceph-mon[288586]: Safe to remove mon.np0005625201: new quorum should be ['np0005625204', 'np0005625202', 'np0005625203'] (from ['np0005625204', 
'np0005625202', 'np0005625203']) Feb 20 04:46:09 localhost ceph-mon[288586]: Removing monitor np0005625201 from monmap... Feb 20 04:46:09 localhost ceph-mon[288586]: Removing daemon mon.np0005625201 from np0005625201.localdomain -- ports [] Feb 20 04:46:09 localhost ceph-mon[288586]: mon.np0005625204 calling monitor election Feb 20 04:46:09 localhost ceph-mon[288586]: mon.np0005625203 calling monitor election Feb 20 04:46:09 localhost ceph-mon[288586]: mon.np0005625202 calling monitor election Feb 20 04:46:09 localhost ceph-mon[288586]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2) Feb 20 04:46:09 localhost ceph-mon[288586]: overall HEALTH_OK Feb 20 04:46:09 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:09 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:09 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 20 04:46:09 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:09 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 20 04:46:09 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:10 localhost ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0. 
Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.257847) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28 Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770257950, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1216, "num_deletes": 255, "total_data_size": 1196283, "memory_usage": 1228576, "flush_reason": "Manual Compaction"} Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770266532, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 943693, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20156, "largest_seqno": 21367, "table_properties": {"data_size": 938360, "index_size": 2484, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15099, "raw_average_key_size": 21, "raw_value_size": 925920, "raw_average_value_size": 1309, "num_data_blocks": 103, "num_entries": 707, "num_filter_entries": 707, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580732, "oldest_key_time": 1771580732, "file_creation_time": 1771580770, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 8739 microseconds, and 4967 cpu microseconds. Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.266597) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 943693 bytes OK Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.266626) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268284) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268304) EVENT_LOG_v1 {"time_micros": 1771580770268298, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268325) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1189976, prev total WAL file 
size 1189976, number of live WAL files 2. Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268930) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323733' seq:72057594037927935, type:22 .. '6B760031353239' seq:0, type:0; will stop at (end) Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(921KB)], [27(17MB)] Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770268969, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 19104695, "oldest_snapshot_seqno": -1} Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11094 keys, 18043053 bytes, temperature: kUnknown Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770342331, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 18043053, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17980147, "index_size": 34069, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27781, "raw_key_size": 297635, "raw_average_key_size": 26, "raw_value_size": 17791113, 
"raw_average_value_size": 1603, "num_data_blocks": 1284, "num_entries": 11094, "num_filter_entries": 11094, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580770, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}} Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.342751) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 18043053 bytes Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.344251) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 260.0 rd, 245.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 17.3 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(39.4) write-amplify(19.1) OK, records in: 11641, records dropped: 547 output_compression: NoCompression Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.344283) EVENT_LOG_v1 {"time_micros": 1771580770344267, "job": 14, "event": "compaction_finished", "compaction_time_micros": 73483, "compaction_time_cpu_micros": 38850, "output_level": 6, "num_output_files": 1, "total_output_size": 18043053, "num_input_records": 11641, "num_output_records": 11094, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770344592, "job": 14, "event": "table_file_deletion", "file_number": 29} Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770347157, 
"job": 14, "event": "table_file_deletion", "file_number": 27} Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347221) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:46:10 localhost ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:46:10 localhost ceph-mon[288586]: Added label _no_schedule to host np0005625201.localdomain Feb 20 04:46:10 localhost ceph-mon[288586]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005625201.localdomain Feb 20 04:46:10 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:10 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:11 localhost nova_compute[281288]: 2026-02-20 09:46:11.356 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:11 localhost nova_compute[281288]: 2026-02-20 09:46:11.358 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 
04:46:11 localhost nova_compute[281288]: 2026-02-20 09:46:11.358 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:46:11 localhost nova_compute[281288]: 2026-02-20 09:46:11.359 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:11 localhost nova_compute[281288]: 2026-02-20 09:46:11.390 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:11 localhost nova_compute[281288]: 2026-02-20 09:46:11.391 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:11 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0) Feb 20 04:46:11 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:11 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0) Feb 20 04:46:11 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:11 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:11 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:11 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:46:12 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 20 04:46:12 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:12 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} v 0) Feb 20 04:46:12 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch Feb 20 04:46:12 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"}]': finished Feb 20 04:46:12 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 20 04:46:12 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:12 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:12 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:12 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:12 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:12 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch Feb 20 04:46:12 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch Feb 20 04:46:12 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"}]': finished Feb 20 04:46:12 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:12 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:46:12 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:12 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:46:12 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:12 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:46:12 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:12 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:46:12 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:46:13 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:46:13 localhost ceph-mon[288586]: log_channel(audit) log 
[INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 20 04:46:13 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 20 04:46:13 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:13 localhost ceph-mon[288586]: Removed host np0005625201.localdomain Feb 20 04:46:13 localhost ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:13 localhost ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:13 localhost ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:13 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: 
from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:13 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:13 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:14 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:46:14 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:14 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:46:14 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:14 localhost sshd[299680]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:46:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:46:14 localhost ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)... 
Feb 20 04:46:14 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:46:14 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:14 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:14 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 20 04:46:14 localhost systemd[1]: Created slice User Slice of UID 1003. Feb 20 04:46:14 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Feb 20 04:46:14 localhost systemd-logind[759]: New session 70 of user tripleo-admin. Feb 20 04:46:14 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Feb 20 04:46:15 localhost systemd[1]: Starting User Manager for UID 1003... Feb 20 04:46:15 localhost podman[299682]: 2026-02-20 09:46:15.039343061 +0000 UTC m=+0.113265265 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:46:15 localhost podman[299682]: 2026-02-20 09:46:15.076232415 +0000 UTC m=+0.150154609 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, 
container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:46:15 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:46:15 localhost systemd[299695]: Queued start job for default target Main User Target. Feb 20 04:46:15 localhost systemd[299695]: Created slice User Application Slice. Feb 20 04:46:15 localhost systemd[299695]: Started Mark boot as successful after the user session has run 2 minutes. Feb 20 04:46:15 localhost systemd[299695]: Started Daily Cleanup of User's Temporary Directories. Feb 20 04:46:15 localhost systemd[299695]: Reached target Paths. Feb 20 04:46:15 localhost systemd[299695]: Reached target Timers. Feb 20 04:46:15 localhost systemd[299695]: Starting D-Bus User Message Bus Socket... Feb 20 04:46:15 localhost systemd[299695]: Starting Create User's Volatile Files and Directories... Feb 20 04:46:15 localhost systemd[299695]: Finished Create User's Volatile Files and Directories. Feb 20 04:46:15 localhost systemd[299695]: Listening on D-Bus User Message Bus Socket. Feb 20 04:46:15 localhost systemd[299695]: Reached target Sockets. Feb 20 04:46:15 localhost systemd[299695]: Reached target Basic System. Feb 20 04:46:15 localhost systemd[299695]: Reached target Main User Target. Feb 20 04:46:15 localhost systemd[299695]: Startup finished in 150ms. Feb 20 04:46:15 localhost systemd[1]: Started User Manager for UID 1003. Feb 20 04:46:15 localhost systemd[1]: Started Session 70 of User tripleo-admin. 
Feb 20 04:46:15 localhost ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:46:15 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:46:15 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:15 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:46:15 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:15 localhost ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)... Feb 20 04:46:15 localhost ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain Feb 20 04:46:15 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:15 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:15 localhost ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)... 
Feb 20 04:46:15 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 20 04:46:15 localhost ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain Feb 20 04:46:16 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:46:16 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:16 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:46:16 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:16 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 20 04:46:16 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:46:16 localhost nova_compute[281288]: 2026-02-20 09:46:16.392 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:16 localhost nova_compute[281288]: 2026-02-20 09:46:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:16 localhost nova_compute[281288]: 2026-02-20 09:46:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:46:16 localhost nova_compute[281288]: 2026-02-20 09:46:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:16 localhost nova_compute[281288]: 2026-02-20 09:46:16.431 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:16 localhost nova_compute[281288]: 2026-02-20 09:46:16.431 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:46:16 localhost systemd[1]: tmp-crun.hH9G2T.mount: Deactivated successfully. Feb 20 04:46:16 localhost podman[299850]: 2026-02-20 09:46:16.538955473 +0000 UTC m=+0.093472326 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.7, 
url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:46:16 localhost podman[299850]: 2026-02-20 09:46:16.549961124 +0000 UTC m=+0.104477967 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, release=1770267347, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.7) Feb 20 04:46:16 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:46:16 localhost python3[299849]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line= - ip_netmask: 172.18.0.105/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 20 04:46:17 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:46:17 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:17 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:46:17 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:17 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 20 04:46:17 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:46:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", 
"caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:46:17 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)... Feb 20 04:46:17 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:46:17 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:46:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:17 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:46:17 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:46:17 localhost python3[300014]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.105/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:46:17 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 20 04:46:17 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:17 localhost podman[241968]: time="2026-02-20T09:46:17Z" level=info msg="List containers: received `last` 
parameter - overwriting `limit`" Feb 20 04:46:17 localhost podman[241968]: @ - - [20/Feb/2026:09:46:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:46:17 localhost podman[241968]: @ - - [20/Feb/2026:09:46:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18270 "" "Go-http-client/1.1" Feb 20 04:46:17 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:46:17 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:17 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:46:17 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:18 localhost python3[300159]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.105 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 04:46:18 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... 
Feb 20 04:46:18 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:46:18 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:18 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:18 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:18 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:46:18 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:46:18 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:18 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:46:18 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:18 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 20 04:46:18 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:46:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:46:19 localhost systemd[1]: tmp-crun.0ssSMT.mount: Deactivated successfully. Feb 20 04:46:19 localhost podman[300162]: 2026-02-20 09:46:19.156197881 +0000 UTC m=+0.089261377 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 20 04:46:19 localhost podman[300162]: 2026-02-20 09:46:19.189315988 +0000 UTC m=+0.122379524 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator 
team) Feb 20 04:46:19 localhost podman[300161]: 2026-02-20 09:46:19.203711785 +0000 UTC m=+0.139213260 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:46:19 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:46:19 localhost podman[300161]: 2026-02-20 09:46:19.244070726 +0000 UTC m=+0.179572171 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Feb 20 04:46:19 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:46:19 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:46:19 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:19 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:46:19 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:19 localhost ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)... Feb 20 04:46:19 localhost ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:46:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:19 localhost ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)... 
Feb 20 04:46:19 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:19 localhost ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:46:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:19 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:19 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 20 04:46:20 localhost ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:46:20 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:46:20 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:20 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:46:20 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:20 localhost ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)... 
Feb 20 04:46:20 localhost ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:46:20 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:20 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:20 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 20 04:46:21 localhost nova_compute[281288]: 2026-02-20 09:46:21.433 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:21 localhost nova_compute[281288]: 2026-02-20 09:46:21.435 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:21 localhost nova_compute[281288]: 2026-02-20 09:46:21.435 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:46:21 localhost nova_compute[281288]: 2026-02-20 09:46:21.435 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:21 localhost nova_compute[281288]: 2026-02-20 09:46:21.436 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:21 localhost nova_compute[281288]: 2026-02-20 09:46:21.438 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:21 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 20 04:46:21 localhost 
ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:21 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:46:21 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:21 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:46:21 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:21 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 20 04:46:21 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:46:21 localhost ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)... 
Feb 20 04:46:21 localhost ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:46:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:21 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:46:21 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:46:22 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:46:22 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:22 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:46:22 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:22 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 20 04:46:22 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": 
"mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:46:22 localhost ceph-mon[288586]: Saving service mon spec with placement label:mon Feb 20 04:46:22 localhost ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)... Feb 20 04:46:22 localhost ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain Feb 20 04:46:22 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:22 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:22 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:46:22 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:46:23 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:46:23 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:23 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:46:23 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:23 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 
20 04:46:23 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:23 localhost ceph-mon[288586]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)... Feb 20 04:46:23 localhost ceph-mon[288586]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain Feb 20 04:46:23 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:23 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:23 localhost ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:23 localhost ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:24 localhost podman[300272]: Feb 20 04:46:24 localhost podman[300272]: 2026-02-20 09:46:24.151922625 +0000 UTC m=+0.079582452 container create fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., release=1770267347, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:46:24 localhost systemd[1]: Started libpod-conmon-fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221.scope. Feb 20 04:46:24 localhost systemd[1]: Started libcrun container. Feb 20 04:46:24 localhost podman[300272]: 2026-02-20 09:46:24.118688234 +0000 UTC m=+0.046348091 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:24 localhost podman[300272]: 2026-02-20 09:46:24.231123995 +0000 UTC m=+0.158783822 container init fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:46:24 localhost systemd[1]: tmp-crun.S9p2ca.mount: Deactivated successfully. Feb 20 04:46:24 localhost podman[300272]: 2026-02-20 09:46:24.243900257 +0000 UTC m=+0.171560054 container start fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True) Feb 20 04:46:24 localhost podman[300272]: 2026-02-20 09:46:24.244164034 +0000 UTC m=+0.171823861 
container attach fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., release=1770267347, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main) Feb 20 04:46:24 localhost zealous_nobel[300287]: 167 167 Feb 20 04:46:24 localhost systemd[1]: libpod-fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221.scope: Deactivated successfully. 
Feb 20 04:46:24 localhost podman[300272]: 2026-02-20 09:46:24.249484775 +0000 UTC m=+0.177144602 container died fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, version=7, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.buildah.version=1.42.2, ceph=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph) Feb 20 04:46:24 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "mon rm", "name": "np0005625204"} v 0) Feb 20 04:46:24 localhost ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625204"} : dispatch Feb 20 04:46:24 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fe78000 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Feb 20 04:46:24 localhost podman[300292]: 2026-02-20 09:46:24.373196804 +0000 UTC m=+0.113996985 container remove 
fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph) Feb 20 04:46:24 localhost ceph-mon[288586]: mon.np0005625204@0(leader) e16 removed from monmap, suicide. Feb 20 04:46:24 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Feb 20 04:46:24 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Feb 20 04:46:24 localhost systemd[1]: libpod-conmon-fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221.scope: Deactivated successfully. 
Feb 20 04:46:24 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fcf20 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Feb 20 04:46:24 localhost podman[300317]: 2026-02-20 09:46:24.479172842 +0000 UTC m=+0.063827187 container died 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main) Feb 20 04:46:24 localhost podman[300317]: 2026-02-20 09:46:24.512027402 +0000 UTC m=+0.096681747 container remove 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=rhceph-container, release=1770267347, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:46:25 localhost systemd[1]: tmp-crun.PXEyS3.mount: Deactivated successfully. Feb 20 04:46:25 localhost systemd[1]: var-lib-containers-storage-overlay-9c21ab3a300335fcf797a3afe00a80fbacfe0d7a6765fcee7b83fec0346a2ac7-merged.mount: Deactivated successfully. Feb 20 04:46:25 localhost systemd[1]: var-lib-containers-storage-overlay-c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d-merged.mount: Deactivated successfully. Feb 20 04:46:25 localhost systemd[1]: ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8@mon.np0005625204.service: Deactivated successfully. Feb 20 04:46:25 localhost systemd[1]: Stopped Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8. Feb 20 04:46:25 localhost systemd[1]: ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8@mon.np0005625204.service: Consumed 11.306s CPU time. Feb 20 04:46:25 localhost systemd[1]: Reloading. 
Feb 20 04:46:25 localhost systemd-sysv-generator[300537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:46:25 localhost systemd-rc-local-generator[300532]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:25 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:25 localhost podman[300547]: Feb 20 04:46:26 localhost podman[300547]: 2026-02-20 09:46:26.007276761 +0000 UTC m=+0.078583065 container create 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.openshift.expose-services=, release=1770267347, CEPH_POINT_RELEASE=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume 
Abrioux ) Feb 20 04:46:26 localhost systemd[1]: Started libpod-conmon-0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6.scope. Feb 20 04:46:26 localhost systemd[1]: Started libcrun container. Feb 20 04:46:26 localhost podman[300547]: 2026-02-20 09:46:25.977493578 +0000 UTC m=+0.048799912 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:26 localhost podman[300547]: 2026-02-20 09:46:26.09562789 +0000 UTC m=+0.166934204 container init 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1770267347, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph) Feb 20 04:46:26 localhost podman[300547]: 2026-02-20 09:46:26.108818393 +0000 UTC m=+0.180124697 container start 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=blissful_haibt, version=7, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container) Feb 20 04:46:26 localhost podman[300547]: 2026-02-20 09:46:26.109480851 +0000 UTC m=+0.180787165 container attach 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, release=1770267347, RELEASE=main, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, version=7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public) Feb 20 04:46:26 localhost blissful_haibt[300563]: 167 167 Feb 20 04:46:26 localhost systemd[1]: libpod-0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6.scope: Deactivated successfully. Feb 20 04:46:26 localhost podman[300547]: 2026-02-20 09:46:26.114476413 +0000 UTC m=+0.185782727 container died 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, GIT_CLEAN=True, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, io.openshift.expose-services=, 
GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux ) Feb 20 04:46:26 localhost systemd[1]: var-lib-containers-storage-overlay-14e5931968eb84cd13e0c455b2e48f93a206c0cbcfd1915fd787e5cf140c87df-merged.mount: Deactivated successfully. Feb 20 04:46:26 localhost podman[300568]: 2026-02-20 09:46:26.222940882 +0000 UTC m=+0.100165685 container remove 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:46:26 localhost systemd[1]: libpod-conmon-0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6.scope: Deactivated successfully. 
Feb 20 04:46:26 localhost nova_compute[281288]: 2026-02-20 09:46:26.439 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:26 localhost openstack_network_exporter[244414]: ERROR 09:46:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:46:26 localhost openstack_network_exporter[244414]: Feb 20 04:46:26 localhost openstack_network_exporter[244414]: ERROR 09:46:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:46:26 localhost openstack_network_exporter[244414]: Feb 20 04:46:27 localhost podman[300646]: Feb 20 04:46:27 localhost podman[300646]: 2026-02-20 09:46:27.089981509 +0000 UTC m=+0.078125682 container create 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux , version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:46:27 localhost systemd[1]: Started libpod-conmon-7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406.scope. Feb 20 04:46:27 localhost systemd[1]: Started libcrun container. Feb 20 04:46:27 localhost podman[300646]: 2026-02-20 09:46:27.054846084 +0000 UTC m=+0.042990317 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:27 localhost podman[300646]: 2026-02-20 09:46:27.157489978 +0000 UTC m=+0.145634161 container init 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=) Feb 20 04:46:27 localhost podman[300646]: 2026-02-20 09:46:27.166871063 +0000 UTC m=+0.155015266 
container start 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:46:27 localhost podman[300646]: 2026-02-20 09:46:27.167221354 +0000 UTC m=+0.155365547 container attach 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, version=7, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in 
a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:46:27 localhost zealous_mendel[300662]: 167 167 Feb 20 04:46:27 localhost systemd[1]: libpod-7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406.scope: Deactivated successfully. Feb 20 04:46:27 localhost podman[300646]: 2026-02-20 09:46:27.170687342 +0000 UTC m=+0.158831545 container died 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, distribution-scope=public, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, 
GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True) Feb 20 04:46:27 localhost systemd[1]: tmp-crun.L49kiF.mount: Deactivated successfully. Feb 20 04:46:27 localhost systemd[1]: var-lib-containers-storage-overlay-ba842494c5f1f056888c4128a39db4121e7231ae3394372f2bdd8b93740e679d-merged.mount: Deactivated successfully. Feb 20 04:46:27 localhost podman[300667]: 2026-02-20 09:46:27.280994692 +0000 UTC m=+0.095205974 container remove 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, io.openshift.tags=rhceph ceph, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1770267347, io.buildah.version=1.42.2, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:46:27 localhost systemd[1]: 
libpod-conmon-7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406.scope: Deactivated successfully. Feb 20 04:46:28 localhost podman[300742]: Feb 20 04:46:28 localhost podman[300742]: 2026-02-20 09:46:28.110699153 +0000 UTC m=+0.076542276 container create 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, release=1770267347, name=rhceph, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:46:28 localhost systemd[1]: Started libpod-conmon-3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96.scope. Feb 20 04:46:28 localhost systemd[1]: Started libcrun container. 
Feb 20 04:46:28 localhost podman[300742]: 2026-02-20 09:46:28.172408129 +0000 UTC m=+0.138251242 container init 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, release=1770267347, maintainer=Guillaume Abrioux , ceph=True, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, GIT_BRANCH=main) Feb 20 04:46:28 localhost podman[300742]: 2026-02-20 09:46:28.080358576 +0000 UTC m=+0.046201719 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:28 localhost podman[300742]: 2026-02-20 09:46:28.184221414 +0000 UTC m=+0.150064497 container start 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, release=1770267347, io.openshift.expose-services=, io.buildah.version=1.42.2) Feb 20 04:46:28 localhost podman[300742]: 2026-02-20 09:46:28.184397749 +0000 UTC m=+0.150240902 container attach 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest 
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, release=1770267347, version=7, name=rhceph, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:46:28 localhost friendly_khayyam[300757]: 167 167 Feb 20 04:46:28 localhost systemd[1]: libpod-3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96.scope: Deactivated successfully. Feb 20 04:46:28 localhost podman[300742]: 2026-02-20 09:46:28.186543128 +0000 UTC m=+0.152386221 container died 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:46:28 localhost systemd[1]: 
var-lib-containers-storage-overlay-53bd40fadbf0de131abf706941ba00e28724579547fca12c5978dfd939b2ed9e-merged.mount: Deactivated successfully. Feb 20 04:46:28 localhost podman[300762]: 2026-02-20 09:46:28.279744776 +0000 UTC m=+0.081466126 container remove 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, distribution-scope=public, RELEASE=main, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph) Feb 20 04:46:28 localhost systemd[1]: libpod-conmon-3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96.scope: Deactivated successfully. 
Feb 20 04:46:28 localhost podman[300831]: Feb 20 04:46:28 localhost podman[300831]: 2026-02-20 09:46:28.965713421 +0000 UTC m=+0.080745525 container create 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, ceph=True, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:46:29 localhost systemd[1]: Started libpod-conmon-67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac.scope. Feb 20 04:46:29 localhost systemd[1]: Started libcrun container. 
Feb 20 04:46:29 localhost podman[300831]: 2026-02-20 09:46:29.02967363 +0000 UTC m=+0.144705734 container init 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:46:29 localhost podman[300831]: 2026-02-20 09:46:28.93491515 +0000 UTC m=+0.049947304 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:29 localhost podman[300831]: 2026-02-20 09:46:29.039005294 +0000 UTC m=+0.154037398 container start 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, 
vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 20 04:46:29 localhost podman[300831]: 2026-02-20 09:46:29.039251571 +0000 UTC m=+0.154283715 container attach 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, ceph=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:46:29 localhost stoic_nightingale[300846]: 167 167 Feb 20 04:46:29 localhost systemd[1]: libpod-67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac.scope: Deactivated successfully. Feb 20 04:46:29 localhost podman[300831]: 2026-02-20 09:46:29.041983448 +0000 UTC m=+0.157015582 container died 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , architecture=x86_64, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container) Feb 20 04:46:29 localhost podman[300851]: 
2026-02-20 09:46:29.135050541 +0000 UTC m=+0.085122569 container remove 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=1770267347, vendor=Red Hat, Inc., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, RELEASE=main, build-date=2026-02-09T10:25:24Z) Feb 20 04:46:29 localhost systemd[1]: libpod-conmon-67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac.scope: Deactivated successfully. Feb 20 04:46:29 localhost systemd[1]: var-lib-containers-storage-overlay-15bbdd6125e1e7916d5dc8e574c690d7fb227c187df110f7f1dbe90376dc2093-merged.mount: Deactivated successfully. Feb 20 04:46:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:46:30 localhost podman[300978]: 2026-02-20 09:46:30.262979069 +0000 UTC m=+0.093024263 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:46:30 localhost systemd[1]: tmp-crun.bEg5c3.mount: Deactivated successfully. 
Feb 20 04:46:30 localhost podman[300997]: 2026-02-20 09:46:30.346860622 +0000 UTC m=+0.078925965 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, 
container_name=ceilometer_agent_compute) Feb 20 04:46:30 localhost podman[300997]: 2026-02-20 09:46:30.35846838 +0000 UTC m=+0.090533723 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:46:30 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:46:30 localhost podman[300978]: 2026-02-20 09:46:30.395984071 +0000 UTC m=+0.226029255 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , name=rhceph, vcs-type=git, release=1770267347, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Feb 20 04:46:31 localhost nova_compute[281288]: 2026-02-20 09:46:31.441 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:31 localhost nova_compute[281288]: 2026-02-20 09:46:31.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:31 localhost nova_compute[281288]: 2026-02-20 09:46:31.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:46:31 localhost nova_compute[281288]: 2026-02-20 09:46:31.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:31 localhost nova_compute[281288]: 2026-02-20 09:46:31.461 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:31 localhost nova_compute[281288]: 2026-02-20 09:46:31.461 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:33 localhost sshd[301508]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:46:36 localhost nova_compute[281288]: 2026-02-20 09:46:36.461 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:36 localhost nova_compute[281288]: 2026-02-20 09:46:36.463 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:36 localhost nova_compute[281288]: 2026-02-20 09:46:36.463 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:46:36 localhost nova_compute[281288]: 2026-02-20 09:46:36.463 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:36 localhost nova_compute[281288]: 
2026-02-20 09:46:36.500 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:36 localhost nova_compute[281288]: 2026-02-20 09:46:36.501 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:38 localhost podman[301588]: Feb 20 04:46:38 localhost podman[301588]: 2026-02-20 09:46:38.196896738 +0000 UTC m=+0.064440754 container create 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z) Feb 20 04:46:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:46:38 localhost systemd[1]: Started libpod-conmon-92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4.scope. Feb 20 04:46:38 localhost systemd[1]: Started libcrun container. Feb 20 04:46:38 localhost podman[301588]: 2026-02-20 09:46:38.268724819 +0000 UTC m=+0.136268825 container init 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph) Feb 20 04:46:38 localhost podman[301588]: 2026-02-20 09:46:38.171773197 +0000 UTC m=+0.039317213 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:38 localhost podman[301588]: 2026-02-20 09:46:38.280513124 +0000 UTC m=+0.148057140 container start 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=great_ramanujan, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, release=1770267347, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.component=rhceph-container) Feb 20 04:46:38 localhost podman[301588]: 2026-02-20 09:46:38.280936995 +0000 UTC m=+0.148481041 container attach 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=) Feb 20 04:46:38 localhost great_ramanujan[301604]: 167 167 Feb 20 04:46:38 localhost systemd[1]: libpod-92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4.scope: Deactivated successfully. Feb 20 04:46:38 localhost podman[301588]: 2026-02-20 09:46:38.285404631 +0000 UTC m=+0.152948667 container died 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7) Feb 20 04:46:38 localhost podman[301603]: 2026-02-20 09:46:38.343892956 +0000 UTC m=+0.107862042 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:46:38 localhost podman[301618]: 2026-02-20 09:46:38.366190837 +0000 UTC m=+0.070779623 container remove 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , architecture=x86_64, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:46:38 localhost systemd[1]: libpod-conmon-92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4.scope: Deactivated successfully. Feb 20 04:46:38 localhost podman[301603]: 2026-02-20 09:46:38.402926056 +0000 UTC m=+0.166895142 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:46:38 localhost 
systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:46:38 localhost podman[301649]: Feb 20 04:46:38 localhost podman[301649]: 2026-02-20 09:46:38.440049197 +0000 UTC m=+0.044691806 container create 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, vcs-type=git) Feb 20 04:46:38 localhost systemd[1]: Started libpod-conmon-62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d.scope. Feb 20 04:46:38 localhost systemd[1]: Started libcrun container. 
Feb 20 04:46:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895573c518763e1c4853baf8292c323f213e2b007d3114578144c11eb56f7ea1/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Feb 20 04:46:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895573c518763e1c4853baf8292c323f213e2b007d3114578144c11eb56f7ea1/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Feb 20 04:46:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895573c518763e1c4853baf8292c323f213e2b007d3114578144c11eb56f7ea1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 04:46:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895573c518763e1c4853baf8292c323f213e2b007d3114578144c11eb56f7ea1/merged/var/lib/ceph/mon/ceph-np0005625204 supports timestamps until 2038 (0x7fffffff) Feb 20 04:46:38 localhost podman[301649]: 2026-02-20 09:46:38.500186518 +0000 UTC m=+0.104829127 container init 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.42.2, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-type=git, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=) Feb 20 04:46:38 localhost podman[301649]: 2026-02-20 09:46:38.512601988 +0000 UTC m=+0.117244627 container start 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux , name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, version=7) Feb 20 04:46:38 localhost podman[301649]: 2026-02-20 09:46:38.513815043 +0000 UTC m=+0.118457732 container attach 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=epic_diffie, version=7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347) Feb 20 04:46:38 localhost podman[301649]: 2026-02-20 09:46:38.422176841 +0000 UTC m=+0.026819470 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:38 localhost systemd[1]: libpod-62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d.scope: Deactivated successfully. 
Feb 20 04:46:38 localhost podman[301649]: 2026-02-20 09:46:38.566764661 +0000 UTC m=+0.171407290 container died 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2) Feb 20 04:46:38 localhost podman[301691]: 2026-02-20 09:46:38.626793399 +0000 UTC m=+0.054647376 container remove 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, version=7, name=rhceph, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, maintainer=Guillaume Abrioux ) Feb 20 04:46:38 localhost systemd[1]: libpod-conmon-62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d.scope: Deactivated successfully. Feb 20 04:46:38 localhost systemd[1]: Reloading. Feb 20 04:46:38 localhost systemd-sysv-generator[301736]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:46:38 localhost systemd-rc-local-generator[301732]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:38 localhost systemd[1]: tmp-crun.z5KUlp.mount: Deactivated successfully. Feb 20 04:46:38 localhost systemd[1]: var-lib-containers-storage-overlay-877f099077b7ebd5ab419ce7e06f0d641f94be10dea400192a038ac02adb2f37-merged.mount: Deactivated successfully. Feb 20 04:46:39 localhost systemd[1]: Reloading. Feb 20 04:46:39 localhost systemd-rc-local-generator[301771]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 20 04:46:39 localhost systemd-sysv-generator[301777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:39 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 20 04:46:39 localhost systemd[1]: Starting Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8... 
Feb 20 04:46:39 localhost podman[301839]: Feb 20 04:46:39 localhost podman[301839]: 2026-02-20 09:46:39.671009229 +0000 UTC m=+0.084203193 container create b8341c3094c907ff52dc85cef8921faf70d656a19c0e961cb91363491d2b3a7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, maintainer=Guillaume Abrioux , vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.openshift.expose-services=, name=rhceph) Feb 20 04:46:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab379e22583f9c4eb77f0717103a939beb8b20cf7560cd3809c15db78012938/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 20 04:46:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab379e22583f9c4eb77f0717103a939beb8b20cf7560cd3809c15db78012938/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 20 04:46:39 localhost kernel: xfs 
filesystem being remounted at /var/lib/containers/storage/overlay/0ab379e22583f9c4eb77f0717103a939beb8b20cf7560cd3809c15db78012938/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 20 04:46:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab379e22583f9c4eb77f0717103a939beb8b20cf7560cd3809c15db78012938/merged/var/lib/ceph/mon/ceph-np0005625204 supports timestamps until 2038 (0x7fffffff) Feb 20 04:46:39 localhost podman[301839]: 2026-02-20 09:46:39.728766783 +0000 UTC m=+0.141960747 container init b8341c3094c907ff52dc85cef8921faf70d656a19c0e961cb91363491d2b3a7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, release=1770267347, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph, vcs-type=git, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, version=7, ceph=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:46:39 localhost podman[301839]: 2026-02-20 09:46:39.634813965 +0000 UTC m=+0.048007969 image pull 
registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:39 localhost podman[301839]: 2026-02-20 09:46:39.743938592 +0000 UTC m=+0.157132556 container start b8341c3094c907ff52dc85cef8921faf70d656a19c0e961cb91363491d2b3a7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, release=1770267347, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:46:39 localhost bash[301839]: b8341c3094c907ff52dc85cef8921faf70d656a19c0e961cb91363491d2b3a7c Feb 20 04:46:39 localhost systemd[1]: Started Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8. 
Feb 20 04:46:39 localhost ceph-mon[301857]: set uid:gid to 167:167 (ceph:ceph) Feb 20 04:46:39 localhost ceph-mon[301857]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mon, pid 2 Feb 20 04:46:39 localhost ceph-mon[301857]: pidfile_write: ignore empty --pid-file Feb 20 04:46:39 localhost ceph-mon[301857]: load: jerasure load: lrc Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: RocksDB version: 7.9.2 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Git sha 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Compile date 2026-02-06 00:00:00 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: DB SUMMARY Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: DB Session ID: OMQD63SADIG5WJVO9ZZI Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: CURRENT file: CURRENT Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: IDENTITY file: IDENTITY Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005625204/store.db dir, Total Num: 0, files: Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005625204/store.db: 000004.log size: 636 ; Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.error_if_exists: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.create_if_missing: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.paranoid_checks: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.env: 0x559a1d482a20 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.fs: PosixFileSystem Feb 20 
04:46:39 localhost ceph-mon[301857]: rocksdb: Options.info_log: 0x559a1eac8d20 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_file_opening_threads: 16 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.statistics: (nil) Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.use_fsync: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_log_file_size: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.log_file_time_to_roll: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.keep_log_file_num: 1000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.recycle_log_file_num: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.allow_fallocate: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.allow_mmap_reads: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.allow_mmap_writes: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.use_direct_reads: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.create_missing_column_families: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.db_log_dir: Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.wal_dir: Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.table_cache_numshardbits: 6 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 20 04:46:39 localhost 
ceph-mon[301857]: rocksdb: Options.advise_random_on_open: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.db_write_buffer_size: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.write_buffer_manager: 0x559a1ead9540 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.use_adaptive_mutex: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.rate_limiter: (nil) Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.wal_recovery_mode: 2 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.enable_thread_tracking: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.enable_pipelined_write: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.unordered_write: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.row_cache: None Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.wal_filter: None Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.allow_ingest_behind: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.two_write_queues: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.manual_wal_flush: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.wal_compression: 0 Feb 
20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.atomic_flush: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.persist_stats_to_disk: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.log_readahead_size: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.best_efforts_recovery: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.allow_data_in_errors: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.db_host_id: __hostname__ Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.enforce_single_del_contracts: true Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_background_jobs: 2 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_background_compactions: -1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_subcompactions: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.delayed_write_rate : 16777216 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_total_wal_size: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.stats_dump_period_sec: 600 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.stats_persist_period_sec: 600 Feb 20 04:46:39 
localhost ceph-mon[301857]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_open_files: -1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bytes_per_sync: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_readahead_size: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_background_flushes: -1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Compression algorithms supported: Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: #011kZSTD supported: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: #011kXpressCompression supported: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: #011kBZip2Compression supported: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: #011kLZ4Compression supported: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: #011kZlibCompression supported: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: #011kSnappyCompression supported: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: DMutex implementation: pthread_mutex_t Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005625204/store.db/MANIFEST-000005 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 20 04:46:39 localhost 
ceph-mon[301857]: rocksdb: Options.merge_operator: Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_filter: None Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_filter_factory: None Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.sst_partitioner_factory: None Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.memtable_factory: SkipListFactory Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.table_factory: BlockBasedTable Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559a1eac8980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559a1eac51f0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.write_buffer_size: 33554432 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_write_buffer_number: 2 Feb 20 04:46:39 localhost 
ceph-mon[301857]: rocksdb: Options.compression: NoCompression Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression: Disabled Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.prefix_extractor: nullptr Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.num_levels: 7 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compression_opts.window_bits: -14 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compression_opts.level: 32767 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compression_opts.strategy: 0 Feb 20 04:46:39 
localhost ceph-mon[301857]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compression_opts.enabled: false Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.target_file_size_base: 67108864 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.target_file_size_multiplier: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_base: 268435456 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 20 04:46:39 
localhost ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.arena_block_size: 1048576 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.disable_auto_compactions: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: 
Options.table_properties_collectors: Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.inplace_update_support: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.memtable_huge_page_size: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.bloom_locality: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.max_successive_merges: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.paranoid_file_checks: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.force_consistency_checks: 1 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.report_bg_io_stats: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.ttl: 2592000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.enable_blob_files: false Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.min_blob_size: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.blob_file_size: 268435456 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.blob_compression_type: NoCompression Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.enable_blob_garbage_collection: false Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.blob_file_starting_level: 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005625204/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 316f2b4e-6103-43ad-8119-3359f94ef991 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580799803444, "job": 1, "event": "recovery_started", "wal_files": [4]} Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580799805434, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": 
"bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580799805554, "job": 1, "event": "recovery_finished"} Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x559a1eaece00 Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: DB pointer 0x559a1ebe2000 Feb 20 04:46:39 localhost ceph-mon[301857]: mon.np0005625204 does not exist in monmap, will attempt to join an existing cluster Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:46:39 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 
00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.72 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Sum 1/0 1.72 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.15 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.15 MB/s 
write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x559a1eac51f0#2 capacity: 512.00 MB usage: 0.98 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.77 KB,0.000146031%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Feb 20 04:46:39 localhost ceph-mon[301857]: using public_addr v2:172.18.0.105:0/0 -> [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] Feb 20 04:46:39 localhost ceph-mon[301857]: starting mon.np0005625204 rank -1 at public addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] at bind addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005625204 fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 Feb 20 04:46:39 localhost ceph-mon[301857]: mon.np0005625204@-1(???) e0 preinit fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 Feb 20 04:46:40 localhost systemd[1]: tmp-crun.NNjUiY.mount: Deactivated successfully. 
Feb 20 04:46:41 localhost nova_compute[281288]: 2026-02-20 09:46:41.502 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:41 localhost nova_compute[281288]: 2026-02-20 09:46:41.503 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:41 localhost nova_compute[281288]: 2026-02-20 09:46:41.504 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:46:41 localhost nova_compute[281288]: 2026-02-20 09:46:41.504 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:41 localhost nova_compute[281288]: 2026-02-20 09:46:41.536 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:41 localhost nova_compute[281288]: 2026-02-20 09:46:41.537 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:43 localhost podman[301948]: Feb 20 04:46:43 localhost podman[301948]: 2026-02-20 09:46:43.895182934 +0000 UTC m=+0.075340872 container create 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, ceph=True, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 04:46:43 localhost systemd[1]: Started libpod-conmon-701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264.scope. Feb 20 04:46:43 localhost systemd[1]: Started libcrun container. Feb 20 04:46:43 localhost podman[301948]: 2026-02-20 09:46:43.864357702 +0000 UTC m=+0.044515740 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:46:43 localhost podman[301948]: 2026-02-20 09:46:43.965960996 +0000 UTC m=+0.146118934 container init 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.42.2, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, 
maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z) Feb 20 04:46:43 localhost podman[301948]: 2026-02-20 09:46:43.975789415 +0000 UTC m=+0.155947383 container start 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, RELEASE=main, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:46:43 localhost podman[301948]: 2026-02-20 
09:46:43.976056382 +0000 UTC m=+0.156214340 container attach 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, release=1770267347, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64) Feb 20 04:46:43 localhost busy_taussig[301962]: 167 167 Feb 20 04:46:43 localhost systemd[1]: libpod-701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264.scope: Deactivated successfully. 
Feb 20 04:46:43 localhost podman[301948]: 2026-02-20 09:46:43.980496137 +0000 UTC m=+0.160654125 container died 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, version=7, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True) Feb 20 04:46:44 localhost podman[301967]: 2026-02-20 09:46:44.057049424 +0000 UTC m=+0.066724719 container remove 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, version=7, 
build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:46:44 localhost systemd[1]: libpod-conmon-701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264.scope: Deactivated successfully. Feb 20 04:46:44 localhost systemd[1]: tmp-crun.tv3LaQ.mount: Deactivated successfully. Feb 20 04:46:44 localhost systemd[1]: var-lib-containers-storage-overlay-92c5f6363cedad4fc4c69b0160487342e6839da5c00a7b13ffebb821e3f3ff20-merged.mount: Deactivated successfully. Feb 20 04:46:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:46:45 localhost systemd[1]: tmp-crun.a18jMS.mount: Deactivated successfully. 
Feb 20 04:46:45 localhost podman[302093]: 2026-02-20 09:46:45.149884498 +0000 UTC m=+0.110260130 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.tags=rhceph ceph, release=1770267347, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:46:45 localhost systemd[1]: tmp-crun.cREDQI.mount: Deactivated successfully. 
Feb 20 04:46:45 localhost podman[302111]: 2026-02-20 09:46:45.261083823 +0000 UTC m=+0.103873999 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:46:45 localhost podman[302111]: 2026-02-20 09:46:45.276913782 +0000 UTC m=+0.119703938 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:46:45 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:46:45 localhost podman[302093]: 2026-02-20 09:46:45.335498099 +0000 UTC m=+0.295873701 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing) e16 sync_obtain_latest_monmap Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing) e16 sync_obtain_latest_monmap obtained monmap e16 Feb 20 04:46:46 localhost sshd[302240]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:46:46 localhost nova_compute[281288]: 2026-02-20 09:46:46.538 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:46 localhost nova_compute[281288]: 2026-02-20 09:46:46.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:46 localhost nova_compute[281288]: 2026-02-20 09:46:46.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:46:46 localhost nova_compute[281288]: 2026-02-20 09:46:46.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:46 localhost nova_compute[281288]: 2026-02-20 09:46:46.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:46 localhost nova_compute[281288]: 2026-02-20 09:46:46.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing).mds e17 new map Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing).mds e17 print_map#012e17#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default 
file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01116#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-20T07:58:28.398421+0000#012modified#0112026-02-20T09:40:14.722031+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01183#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26854}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26854 members: 26854#012[mds.mds.np0005625203.zsrwgk{0:26854} state up:active seq 13 addr [v2:172.18.0.107:6808/3334119751,v1:172.18.0.107:6809/3334119751] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005625202.akhmop{-1:17124} state up:standby seq 1 addr [v2:172.18.0.106:6808/3865978972,v1:172.18.0.106:6809/3865978972] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005625204.wnsphl{-1:26848} state up:standby seq 1 addr [v2:172.18.0.108:6808/2508223371,v1:172.18.0.108:6809/2508223371] compat {c=[1],r=[1],i=[17ff]}] Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing).osd e89 crush map has features 3314933000852226048, adjusting msgr requires Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing).osd e89 crush map 
has features 288514051259236352, adjusting msgr requires Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing).osd e89 crush map has features 288514051259236352, adjusting msgr requires Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing).osd e89 crush map has features 288514051259236352, adjusting msgr requires Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625201 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625202 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625203 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625201 is new leader, mons np0005625201,np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2,3) Feb 20 04:46:46 localhost ceph-mon[301857]: overall HEALTH_OK Feb 20 04:46:46 localhost ceph-mon[301857]: Remove daemons mon.np0005625201 Feb 20 04:46:46 localhost ceph-mon[301857]: Safe to remove mon.np0005625201: new quorum should be ['np0005625204', 'np0005625202', 'np0005625203'] (from ['np0005625204', 'np0005625202', 'np0005625203']) Feb 20 04:46:46 localhost ceph-mon[301857]: Removing monitor np0005625201 from monmap... 
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625201"} : dispatch Feb 20 04:46:46 localhost ceph-mon[301857]: Removing daemon mon.np0005625201 from np0005625201.localdomain -- ports [] Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625202 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 is new leader, mons np0005625204,np0005625202 in quorum (ranks 0,1) Feb 20 04:46:46 localhost ceph-mon[301857]: Health check failed: 1/3 mons down, quorum np0005625204,np0005625202 (MON_DOWN) Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:46:46 localhost ceph-mon[301857]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005625204,np0005625202 Feb 20 04:46:46 localhost ceph-mon[301857]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005625204,np0005625202 Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625203 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum) Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625203 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating 
np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625202 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2) Feb 20 04:46:46 localhost ceph-mon[301857]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005625204,np0005625202) Feb 20 04:46:46 localhost ceph-mon[301857]: Cluster is now healthy Feb 20 04:46:46 localhost ceph-mon[301857]: overall HEALTH_OK Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 
04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: Deploying daemon mon.np0005625201 on np0005625201.localdomain Feb 20 04:46:46 localhost ceph-mon[301857]: Removed label mon from host np0005625201.localdomain Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: Removed label mgr from host np0005625201.localdomain Feb 20 04:46:46 localhost ceph-mon[301857]: Removed label _admin from host np0005625201.localdomain Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625202 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625203 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625201 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203,np0005625201 in quorum (ranks 0,1,2,3) Feb 20 04:46:46 localhost ceph-mon[301857]: overall HEALTH_OK Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:46:46 localhost ceph-mon[301857]: Removing np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost 
ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Removing np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:46:46 localhost ceph-mon[301857]: Removing np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: Removing daemon mgr.np0005625201.mtnyvu from np0005625201.localdomain -- ports [8765] Feb 20 04:46:46 localhost ceph-mon[301857]: Removing key for mgr.np0005625201.mtnyvu Feb 20 04:46:46 localhost ceph-mon[301857]: Safe to remove mon.np0005625201: new quorum should be ['np0005625204', 'np0005625202', 
'np0005625203'] (from ['np0005625204', 'np0005625202', 'np0005625203']) Feb 20 04:46:46 localhost ceph-mon[301857]: Removing monitor np0005625201 from monmap... Feb 20 04:46:46 localhost ceph-mon[301857]: Removing daemon mon.np0005625201 from np0005625201.localdomain -- ports [] Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625203 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625202 calling monitor election Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2) Feb 20 04:46:46 localhost ceph-mon[301857]: overall HEALTH_OK Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: Added label _no_schedule to host np0005625201.localdomain Feb 20 04:46:46 localhost ceph-mon[301857]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005625201.localdomain Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating 
np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"}]': finished Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: Removed host np0005625201.localdomain Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' 
entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.2 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.5 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.1 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.4 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Saving service mon spec with placement label:mon
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: Remove daemons mon.np0005625204
Feb 20 04:46:46 localhost ceph-mon[301857]: Safe to remove mon.np0005625204: new quorum should be ['np0005625202', 'np0005625203'] (from ['np0005625202', 'np0005625203'])
Feb 20 04:46:46 localhost ceph-mon[301857]: Removing monitor np0005625204 from monmap...
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625204"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Removing daemon mon.np0005625204 from np0005625204.localdomain -- ports []
Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625203 calling monitor election
Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625202 calling monitor election
Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625202 is new leader, mons np0005625202,np0005625203 in quorum (ranks 0,1)
Feb 20 04:46:46 localhost ceph-mon[301857]: overall HEALTH_OK
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.0 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.3 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:46:46 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.2 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.5 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Deploying daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.1 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring osd.4 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 04:46:46 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy'
Feb 20 04:46:46 localhost ceph-mon[301857]: mon.np0005625204@-1(synchronizing).paxosservice(auth 1..43) refresh upgraded, format 0 -> 3
Feb 20 04:46:46 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fe789a0 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 20 04:46:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 04:46:47 localhost podman[302260]: 2026-02-20 09:46:47.149673206 +0000 UTC m=+0.083073097 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, version=9.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, 
distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=) Feb 20 04:46:47 localhost podman[302260]: 2026-02-20 09:46:47.164026796 +0000 UTC m=+0.097426627 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': 
[], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 04:46:47 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 04:46:47 localhost podman[241968]: time="2026-02-20T09:46:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 04:46:47 localhost podman[241968]: @ - - [20/Feb/2026:09:46:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 04:46:47 localhost nova_compute[281288]: 2026-02-20 09:46:47.724 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 20 04:46:47 localhost podman[241968]: @ - - [20/Feb/2026:09:46:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18266 "" "Go-http-client/1.1"
Feb 20 04:46:47 localhost nova_compute[281288]: 2026-02-20 09:46:47.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 04:46:47 localhost nova_compute[281288]: 2026-02-20 09:46:47.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 04:46:47 localhost nova_compute[281288]: 2026-02-20 09:46:47.744 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 04:46:47 localhost nova_compute[281288]: 2026-02-20 09:46:47.744 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 20 04:46:47 localhost nova_compute[281288]: 2026-02-20 09:46:47.744 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 20 04:46:47 localhost ceph-mon[301857]: mon.np0005625204@-1(probing) e16 handle_auth_request failed to assign global_id
Feb 20 04:46:48 localhost ceph-mon[301857]: mon.np0005625204@-1(probing) e16 handle_auth_request failed to assign global_id
Feb 20 04:46:48 localhost ceph-mon[301857]: mon.np0005625204@-1(probing) e16 handle_auth_request failed to assign global_id
Feb 20 04:46:48 localhost ceph-mon[301857]: mon.np0005625204@-1(probing) e17 my rank is now 2 (was -1)
Feb 20 04:46:48 localhost ceph-mon[301857]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 04:46:48 localhost ceph-mon[301857]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1
Feb 20 04:46:48 localhost ceph-mon[301857]: mon.np0005625204@2(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 20 04:46:49 localhost ceph-mon[301857]: mon.np0005625204@2(electing) e17
handle_auth_request failed to assign global_id Feb 20 04:46:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:46:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:46:50 localhost podman[302292]: 2026-02-20 09:46:50.147040509 +0000 UTC m=+0.077975511 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:46:50 localhost podman[302292]: 2026-02-20 09:46:50.157032335 +0000 UTC m=+0.087967357 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent) Feb 20 04:46:50 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:46:50 localhost podman[302291]: 2026-02-20 09:46:50.25672683 +0000 UTC m=+0.190739536 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:46:50 localhost podman[302291]: 
2026-02-20 09:46:50.35426412 +0000 UTC m=+0.288276816 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 20 04:46:50 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:46:50 localhost ceph-mon[301857]: mon.np0005625204@2(electing) e17 handle_auth_request failed to assign global_id Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625204@2(electing) e17 handle_auth_request failed to assign global_id Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625204@2(electing) e17 handle_auth_request failed to assign global_id Feb 20 04:46:51 localhost nova_compute[281288]: 2026-02-20 09:46:51.576 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:46:51 localhost nova_compute[281288]: 2026-02-20 09:46:51.578 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625204@2(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code} Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 20 04:46:51 localhost ceph-mon[301857]: mgrc update_daemon_metadata mon.np0005625204 metadata {addrs=[v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-381.el9cp 
(984f410e2a30899deb131725765b62212b1621db) reef (stable),ceph_version_short=18.2.1-381.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005625204.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005625204.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux} Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_auth_request failed to assign global_id Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625202 calling monitor election Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625203 calling monitor election Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625204 calling monitor election Feb 20 04:46:51 localhost ceph-mon[301857]: mon.np0005625202 is new leader, mons np0005625202,np0005625203,np0005625204 in quorum (ranks 0,1,2) Feb 20 04:46:51 localhost ceph-mon[301857]: overall HEALTH_OK Feb 20 04:46:51 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:52 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:46:52 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/542821844' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:46:52 localhost nova_compute[281288]: 2026-02-20 09:46:52.680 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.935s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:46:52 localhost nova_compute[281288]: 2026-02-20 09:46:52.749 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:46:52 localhost nova_compute[281288]: 2026-02-20 09:46:52.751 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:46:52 localhost nova_compute[281288]: 2026-02-20 09:46:52.953 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:46:52 localhost nova_compute[281288]: 2026-02-20 09:46:52.955 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11740MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:46:52 localhost nova_compute[281288]: 2026-02-20 09:46:52.956 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:46:52 localhost nova_compute[281288]: 2026-02-20 09:46:52.956 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.023 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.024 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.024 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.061 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:46:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:46:53 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1416339821' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. 
Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.519731) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813519855, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11856, "num_deletes": 254, "total_data_size": 20741104, "memory_usage": 21988520, "flush_reason": "Manual Compaction"} Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.522 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.530 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.553 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 
'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.557 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:46:53 localhost nova_compute[281288]: 2026-02-20 09:46:53.558 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813590524, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 16372900, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11861, "table_properties": {"data_size": 16308556, "index_size": 34542, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28741, "raw_key_size": 307741, "raw_average_key_size": 26, "raw_value_size": 16114056, "raw_average_value_size": 1404, "num_data_blocks": 1305, "num_entries": 11475, "num_filter_entries": 11475, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580806, "oldest_key_time": 1771580806, "file_creation_time": 1771580813, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 70996 microseconds, and 32368 cpu microseconds. Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.590730) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 16372900 bytes OK Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.590798) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.592765) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.592791) EVENT_LOG_v1 {"time_micros": 1771580813592783, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.592817) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 20658698, prev total WAL file size 20680587, 
number of live WAL files 2. Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.598005) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end) Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(15MB) 8(1762B)] Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813598144, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 16374662, "oldest_snapshot_seqno": -1} Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 11225 keys, 16369345 bytes, temperature: kUnknown Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813678336, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 16369345, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16305678, "index_size": 34510, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28101, "raw_key_size": 302972, "raw_average_key_size": 26, "raw_value_size": 16114461, "raw_average_value_size": 1435, 
"num_data_blocks": 1304, "num_entries": 11225, "num_filter_entries": 11225, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771580813, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.678750) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 16369345 bytes Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.680716) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.9 rd, 203.8 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(15.6, 0.0 +0.0 blob) out(15.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 11480, records dropped: 255 output_compression: NoCompression Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.680749) EVENT_LOG_v1 {"time_micros": 1771580813680733, "job": 4, "event": "compaction_finished", "compaction_time_micros": 80315, "compaction_time_cpu_micros": 46153, "output_level": 6, "num_output_files": 1, "total_output_size": 16369345, "num_input_records": 11480, "num_output_records": 11225, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813683364, "job": 4, "event": "table_file_deletion", "file_number": 14} Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813683432, "job": 4, 
"event": "table_file_deletion", "file_number": 8} Feb 20 04:46:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.597863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:53 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:54 localhost ceph-mon[301857]: Reconfig service osd.default_drive_group Feb 20 04:46:54 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:54 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:54 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:46:55 localhost sshd[302688]: main: sshd: ssh-rsa 
algorithm is disabled Feb 20 04:46:55 localhost nova_compute[281288]: 2026-02-20 09:46:55.555 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:55 localhost nova_compute[281288]: 2026-02-20 09:46:55.557 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:55 localhost nova_compute[281288]: 2026-02-20 09:46:55.557 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:46:55 localhost nova_compute[281288]: 2026-02-20 09:46:55.558 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:46:55 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:55 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:55 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 ' 
entity='mgr.np0005625203.lonygy' Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:55 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.157 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.157 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.158 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.158 281292 
DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:46:56 localhost openstack_network_exporter[244414]: ERROR 09:46:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:46:56 localhost openstack_network_exporter[244414]: Feb 20 04:46:56 localhost openstack_network_exporter[244414]: ERROR 09:46:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:46:56 localhost openstack_network_exporter[244414]: Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.581 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:46:56 localhost ceph-mon[301857]: Reconfiguring crash.np0005625202 (monmap changed)... 
Feb 20 04:46:56 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain Feb 20 04:46:56 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:56 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:56 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 20 04:46:56 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.836 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.853 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.853 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.854 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.854 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.854 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.855 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.855 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.855 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:46:56 localhost nova_compute[281288]: 2026-02-20 09:46:56.855 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:46:57 localhost ceph-mon[301857]: Reconfiguring osd.2 (monmap changed)... 
Feb 20 04:46:57 localhost ceph-mon[301857]: Reconfiguring daemon osd.2 on np0005625202.localdomain Feb 20 04:46:57 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:57 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:57 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:57 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:57 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e89 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e89 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 e90: 6 total, 6 up, 6 in Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr handle_mgr_map Activating! 
Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr handle_mgr_map I am now activating Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005625202"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005625203"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005625204"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).mds e17 all = 0 Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds 
metadata", "who": "mds.np0005625204.wnsphl"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).mds e17 all = 0 Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).mds e17 all = 0 Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: 
mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 0} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 1} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 2} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 3} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 4} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 5} : 
dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds metadata"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).mds e17 all = 1 Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon metadata"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata"} : dispatch Feb 20 04:46:58 localhost ceph-mgr[287186]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: balancer Feb 20 04:46:58 localhost ceph-mgr[287186]: [balancer INFO root] Starting Feb 20 04:46:58 localhost ceph-mgr[287186]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: [balancer INFO root] Optimize plan auto_2026-02-20_09:46:58 Feb 20 04:46:58 localhost ceph-mgr[287186]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Feb 20 04:46:58 localhost ceph-mgr[287186]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later Feb 20 04:46:58 localhost systemd[1]: session-69.scope: Deactivated successfully. Feb 20 04:46:58 localhost systemd[1]: session-69.scope: Consumed 26.523s CPU time. 
Feb 20 04:46:58 localhost systemd-logind[759]: Session 69 logged out. Waiting for processes to exit. Feb 20 04:46:58 localhost systemd-logind[759]: Removed session 69. Feb 20 04:46:58 localhost ceph-mgr[287186]: [cephadm WARNING root] removing stray HostCache host record np0005625201.localdomain.devices.0 Feb 20 04:46:58 localhost ceph-mgr[287186]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005625201.localdomain.devices.0 Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: cephadm Feb 20 04:46:58 localhost ceph-mgr[287186]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: crash Feb 20 04:46:58 localhost ceph-mgr[287186]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: devicehealth Feb 20 04:46:58 localhost ceph-mgr[287186]: [devicehealth INFO root] Starting Feb 20 04:46:58 localhost ceph-mgr[287186]: [iostat DEBUG root] 
setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: iostat Feb 20 04:46:58 localhost ceph-mgr[287186]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: nfs Feb 20 04:46:58 localhost ceph-mgr[287186]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: orchestrator Feb 20 04:46:58 localhost ceph-mgr[287186]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: pg_autoscaler Feb 20 04:46:58 localhost ceph-mgr[287186]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: progress Feb 20 04:46:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] _maybe_adjust Feb 20 04:46:58 localhost ceph-mgr[287186]: [progress INFO root] Loading... Feb 20 04:46:58 localhost ceph-mgr[287186]: [progress INFO root] Loaded [, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ] historic events Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: [progress INFO root] Loaded OSDMap, ready. 
Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] recovery thread starting Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] starting setup Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: rbd_support Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/mirror_snapshot_schedule"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/mirror_snapshot_schedule"} : dispatch Feb 20 04:46:58 localhost ceph-mgr[287186]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: restful Feb 20 04:46:58 localhost ceph-mgr[287186]: [restful INFO root] server_addr: :: server_port: 8003 Feb 20 04:46:58 localhost ceph-mgr[287186]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: status Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: vms, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: [restful WARNING root] server not running: no certificate configured Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: telemetry Feb 20 04:46:58 localhost ceph-mgr[287186]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: 
volumes, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: images, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: backups, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] PerfHandler: starting Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_task_task: vms, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs' Feb 20 04:46:58 localhost ceph-mgr[287186]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs' Feb 20 04:46:58 localhost ceph-mgr[287186]: mgr load Constructed class from module: volumes Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_task_task: volumes, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin 
socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost 
ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_task_task: images, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_task_task: backups, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] TaskHandler: starting Feb 20 04:46:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/trash_purge_schedule"} v 0) Feb 20 04:46:58 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/trash_purge_schedule"} : dispatch Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: vms, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: volumes, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: images, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: backups, start_after= Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting Feb 20 04:46:58 localhost ceph-mgr[287186]: [rbd_support INFO root] setup complete Feb 20 04:46:58 localhost sshd[302847]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:46:58 localhost systemd-logind[759]: New session 72 of user ceph-admin. Feb 20 04:46:58 localhost systemd[1]: Started Session 72 of User ceph-admin. Feb 20 04:46:58 localhost ceph-mon[301857]: Reconfiguring osd.5 (monmap changed)... 
Feb 20 04:46:58 localhost ceph-mon[301857]: Reconfiguring daemon osd.5 on np0005625202.localdomain Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: from='client.? 172.18.0.200:0/1025406798' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: Activating manager daemon np0005625204.exgrzx Feb 20 04:46:58 localhost ceph-mon[301857]: from='client.? 
172.18.0.200:0/1025406798' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 20 04:46:58 localhost ceph-mon[301857]: Manager daemon np0005625204.exgrzx is now available Feb 20 04:46:58 localhost ceph-mon[301857]: removing stray HostCache host record np0005625201.localdomain.devices.0 Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"}]': finished Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"}]': finished Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/mirror_snapshot_schedule"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/mirror_snapshot_schedule"} : dispatch 
Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/trash_purge_schedule"} : dispatch Feb 20 04:46:58 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/trash_purge_schedule"} : dispatch Feb 20 04:46:59 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:46:59 localhost ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Bus STARTING Feb 20 04:46:59 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Bus STARTING Feb 20 04:46:59 localhost systemd[1]: tmp-crun.TUf2OM.mount: Deactivated successfully. Feb 20 04:46:59 localhost podman[302959]: 2026-02-20 09:46:59.809805069 +0000 UTC m=+0.109136035 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, architecture=x86_64, vcs-type=git, RELEASE=main, GIT_BRANCH=main, release=1770267347, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, 
CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:46:59 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1019817561 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:46:59 localhost ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Serving on http://172.18.0.108:8765 Feb 20 04:46:59 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Serving on http://172.18.0.108:8765 Feb 20 04:46:59 localhost podman[302959]: 2026-02-20 09:46:59.960064454 +0000 UTC m=+0.259395420 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.buildah.version=1.42.2, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 04:46:59 localhost ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Serving on https://172.18.0.108:7150 Feb 20 04:46:59 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Serving on https://172.18.0.108:7150 Feb 20 04:46:59 localhost ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Bus STARTED Feb 20 04:46:59 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Bus STARTED Feb 20 04:46:59 localhost ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Client ('172.18.0.108', 51320) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 20 04:46:59 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Client ('172.18.0.108', 51320) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 20 04:47:00 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:47:00 localhost ceph-mgr[287186]: [devicehealth INFO root] Check health Feb 20 04:47:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:47:00 localhost podman[303084]: 2026-02-20 09:47:00.597260743 +0000 UTC m=+0.100656526 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Feb 20 04:47:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:47:00 localhost podman[303084]: 2026-02-20 09:47:00.609816467 +0000 UTC m=+0.113212320 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 20 04:47:00 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:47:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:01 localhost ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Bus STARTING Feb 20 04:47:01 localhost ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Serving on http://172.18.0.108:8765 Feb 20 04:47:01 localhost ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Serving on https://172.18.0.108:7150 Feb 20 04:47:01 localhost ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Bus STARTED Feb 20 04:47:01 localhost ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Client ('172.18.0.108', 51320) lost — peer dropped the TLS connection suddenly, during 
handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 20 04:47:01 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:01 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:01 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:01 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:01 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:01 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:01 localhost nova_compute[281288]: 2026-02-20 09:47:01.581 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:01 localhost nova_compute[281288]: 2026-02-20 09:47:01.588 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Feb 20 04:47:02 localhost 
ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO root] Adjusting osd_memory_target on np0005625202.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005625202.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005625202.localdomain to 
877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO root] Adjusting osd_memory_target on np0005625203.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005625203.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 
handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO root] Adjusting osd_memory_target on np0005625204.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005625204.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 20 04:47:02 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating 
np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.5", "name": 
"osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mon[301857]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:02 localhost ceph-mon[301857]: 
Adjusting osd_memory_target on np0005625204.localdomain to 836.6M Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:47:02 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:02 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:02 localhost 
ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:03 localhost ceph-mgr[287186]: mgr.server handle_open ignoring open from mgr.np0005625203.lonygy 172.18.0.107:0/1974106064; not ready for session (expect reconnect) Feb 20 04:47:03 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:03 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:03 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:03 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mgr[287186]: mgr.server handle_open ignoring open from mgr.np0005625203.lonygy 172.18.0.107:0/1974106064; not ready for session (expect reconnect) Feb 20 04:47:04 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mgr[287186]: 
log_channel(cephadm) log [INF] : Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 23 op/s Feb 20 04:47:04 localhost sshd[303804]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:47:04 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:04 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:04 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf Feb 20 04:47:04 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} v 0) Feb 20 04:47:04 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch Feb 20 04:47:04 
localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:47:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:47:04 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020051072 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 20 04:47:05 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 0 B/s 
wr, 18 op/s Feb 20 04:47:05 localhost ceph-mgr[287186]: [progress INFO root] update: starting ev 337b38bf-9a31-4bb9-9a16-f63a4dbf816e (Updating node-proxy deployment (+3 -> 3)) Feb 20 04:47:05 localhost ceph-mgr[287186]: [progress INFO root] complete: finished ev 337b38bf-9a31-4bb9-9a16-f63a4dbf816e (Updating node-proxy deployment (+3 -> 3)) Feb 20 04:47:05 localhost ceph-mgr[287186]: [progress INFO root] Completed event 337b38bf-9a31-4bb9-9a16-f63a4dbf816e (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Feb 20 04:47:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Feb 20 04:47:05 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Feb 20 04:47:05 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:05 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:05 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring Feb 20 04:47:05 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:05 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:05 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:05 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:05 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:05 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)... 
Feb 20 04:47:05 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)... Feb 20 04:47:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 20 04:47:05 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:05 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:05 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:47:05 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:47:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:47:06.010 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:47:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:47:06.010 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:47:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:47:06.011 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:47:06 localhost nova_compute[281288]: 2026-02-20 09:47:06.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:06 localhost nova_compute[281288]: 2026-02-20 09:47:06.588 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:47:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:47:06 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... Feb 20 04:47:06 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... 
Feb 20 04:47:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 20 04:47:06 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0) Feb 20 04:47:06 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch Feb 20 04:47:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:06 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:06 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:47:06 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:47:06 localhost ceph-mon[301857]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)... 
Feb 20 04:47:06 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:06 localhost ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain Feb 20 04:47:06 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:06 localhost ceph-mon[301857]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Feb 20 04:47:06 localhost ceph-mon[301857]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Feb 20 04:47:06 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:06 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:06 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:06 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:07 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 0 B/s wr, 14 op/s Feb 20 04:47:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:47:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:47:07 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005625203 (monmap changed)... Feb 20 04:47:07 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005625203 (monmap changed)... Feb 20 04:47:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 20 04:47:07 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:47:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:07 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:07 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:47:07 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:47:08 localhost ceph-mon[301857]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)... 
Feb 20 04:47:08 localhost ceph-mon[301857]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain Feb 20 04:47:08 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:08 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:47:08 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:08 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:47:08 localhost ceph-mgr[287186]: [progress INFO root] Writing back 50 completed events Feb 20 04:47:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 20 04:47:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:08 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)... Feb 20 04:47:08 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... 
Feb 20 04:47:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Feb 20 04:47:08 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 20 04:47:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:08 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:08 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:47:08 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:47:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:47:09 localhost podman[303913]: 2026-02-20 09:47:09.136226523 +0000 UTC m=+0.066966134 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:47:09 localhost podman[303913]: 2026-02-20 09:47:09.177146137 +0000 UTC m=+0.107885768 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:47:09 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:47:09 localhost ceph-mon[301857]: Reconfiguring crash.np0005625203 (monmap changed)... Feb 20 04:47:09 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain Feb 20 04:47:09 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:09 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:09 localhost ceph-mon[301857]: Reconfiguring osd.1 (monmap changed)... Feb 20 04:47:09 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 20 04:47:09 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:09 localhost ceph-mon[301857]: Reconfiguring daemon osd.1 on np0005625203.localdomain Feb 20 04:47:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:09 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s Feb 20 04:47:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:09 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)... Feb 20 04:47:09 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)... Feb 20 04:47:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0) Feb 20 04:47:09 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 20 04:47:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:09 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:09 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:47:09 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:47:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054674 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:10 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:10 localhost ceph-mon[301857]: from='mgr.27007 ' 
entity='mgr.np0005625204.exgrzx' Feb 20 04:47:10 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:10 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:10 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 20 04:47:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:10 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)... Feb 20 04:47:10 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)... 
Feb 20 04:47:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 20 04:47:10 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:10 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:10 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain Feb 20 04:47:10 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain Feb 20 04:47:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:11 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005625203.lonygy (monmap changed)... Feb 20 04:47:11 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005625203.lonygy (monmap changed)... 
Feb 20 04:47:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 20 04:47:11 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0) Feb 20 04:47:11 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch Feb 20 04:47:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:11 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:11 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain Feb 20 04:47:11 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain Feb 20 04:47:11 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s Feb 20 04:47:11 localhost ceph-mon[301857]: Reconfiguring osd.4 (monmap changed)... 
Feb 20 04:47:11 localhost ceph-mon[301857]: Reconfiguring daemon osd.4 on np0005625203.localdomain Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:11 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:11 localhost nova_compute[281288]: 2026-02-20 09:47:11.587 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:12 localhost ceph-mon[301857]: 
mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:12 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:12 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005625204 (monmap changed)... Feb 20 04:47:12 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005625204 (monmap changed)... Feb 20 04:47:12 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 20 04:47:12 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:47:12 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:12 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:12 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain Feb 20 04:47:12 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain Feb 20 04:47:12 localhost ceph-mon[301857]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)... 
Feb 20 04:47:12 localhost ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain Feb 20 04:47:12 localhost ceph-mon[301857]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)... Feb 20 04:47:12 localhost ceph-mon[301857]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain Feb 20 04:47:12 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:12 localhost ceph-mon[301857]: Reconfiguring crash.np0005625204 (monmap changed)... Feb 20 04:47:12 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:47:12 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:12 localhost ceph-mon[301857]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain Feb 20 04:47:12 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 20 04:47:12 localhost podman[303990]: Feb 20 04:47:12 localhost podman[303990]: 2026-02-20 09:47:12.909110222 +0000 UTC m=+0.075186225 container create bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, io.openshift.expose-services=, release=1770267347, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
io.buildah.version=1.42.2, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 20 04:47:12 localhost systemd[1]: Started libpod-conmon-bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1.scope. Feb 20 04:47:12 localhost systemd[1]: Started libcrun container. Feb 20 04:47:12 localhost podman[303990]: 2026-02-20 09:47:12.881213487 +0000 UTC m=+0.047289500 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:47:12 localhost podman[303990]: 2026-02-20 09:47:12.994244361 +0000 UTC m=+0.160320364 container init bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, 
com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public) Feb 20 04:47:13 localhost systemd[1]: tmp-crun.LazvEH.mount: Deactivated successfully. Feb 20 04:47:13 localhost podman[303990]: 2026-02-20 09:47:13.013727098 +0000 UTC m=+0.179803101 container start bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, build-date=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, release=1770267347, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main) Feb 20 04:47:13 localhost podman[303990]: 
2026-02-20 09:47:13.013932715 +0000 UTC m=+0.180008728 container attach bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, release=1770267347, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Feb 20 04:47:13 localhost vibrant_jepsen[304005]: 167 167 Feb 20 04:47:13 localhost systemd[1]: libpod-bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1.scope: Deactivated successfully. 
Feb 20 04:47:13 localhost podman[303990]: 2026-02-20 09:47:13.018536966 +0000 UTC m=+0.184613039 container died bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, version=7, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7) Feb 20 04:47:13 localhost podman[304010]: 2026-02-20 09:47:13.129076643 +0000 UTC m=+0.096483057 container remove bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, name=rhceph, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:47:13 localhost systemd[1]: libpod-conmon-bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1.scope: Deactivated successfully. Feb 20 04:47:13 localhost ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44559 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Feb 20 04:47:13 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:13 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:13 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)... Feb 20 04:47:13 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)... 
Feb 20 04:47:13 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) Feb 20 04:47:13 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 20 04:47:13 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:13 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:13 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005625204.localdomain Feb 20 04:47:13 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005625204.localdomain Feb 20 04:47:13 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s Feb 20 04:47:13 localhost podman[304081]: Feb 20 04:47:13 localhost podman[304081]: 2026-02-20 09:47:13.852944229 +0000 UTC m=+0.077169277 container create a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z) Feb 20 04:47:13 localhost systemd[1]: Started libpod-conmon-a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24.scope. Feb 20 04:47:13 localhost systemd[1]: Started libcrun container. Feb 20 04:47:13 localhost systemd[1]: var-lib-containers-storage-overlay-b8a74d72450a7e3cc2d43c0da7d92e4e7f1a6b3a4ea509e7ba96765527b8cd3f-merged.mount: Deactivated successfully. Feb 20 04:47:13 localhost podman[304081]: 2026-02-20 09:47:13.823578238 +0000 UTC m=+0.047803276 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:47:13 localhost podman[304081]: 2026-02-20 09:47:13.924671576 +0000 UTC m=+0.148896614 container init a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux ) Feb 20 04:47:13 localhost podman[304081]: 2026-02-20 09:47:13.934518039 +0000 UTC m=+0.158743077 container start a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux ) Feb 20 04:47:13 localhost podman[304081]: 2026-02-20 09:47:13.934825698 
+0000 UTC m=+0.159050786 container attach a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, architecture=x86_64, build-date=2026-02-09T10:25:24Z, release=1770267347, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:47:13 localhost trusting_cerf[304097]: 167 167 Feb 20 04:47:13 localhost systemd[1]: libpod-a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24.scope: Deactivated successfully. 
Feb 20 04:47:13 localhost podman[304081]: 2026-02-20 09:47:13.939053818 +0000 UTC m=+0.163278906 container died a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, version=7) Feb 20 04:47:14 localhost podman[304102]: 2026-02-20 09:47:14.045918613 +0000 UTC m=+0.093453046 container remove a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in 
a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, release=1770267347, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:47:14 localhost systemd[1]: libpod-conmon-a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24.scope: Deactivated successfully. Feb 20 04:47:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:14 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:14 localhost ceph-mon[301857]: Reconfiguring osd.0 (monmap changed)... 
Feb 20 04:47:14 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 20 04:47:14 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:14 localhost ceph-mon[301857]: Reconfiguring daemon osd.0 on np0005625204.localdomain Feb 20 04:47:14 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:14 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... Feb 20 04:47:14 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... Feb 20 04:47:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0) Feb 20 04:47:14 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 20 04:47:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:14 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:14 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005625204.localdomain Feb 20 04:47:14 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005625204.localdomain Feb 20 04:47:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 
348127232 kv_alloc: 322961408 Feb 20 04:47:14 localhost systemd[1]: var-lib-containers-storage-overlay-814a2e032646545d5e27fe113f9120c8ce9e759df56810fd845e7402bcd46083-merged.mount: Deactivated successfully. Feb 20 04:47:14 localhost podman[304178]: Feb 20 04:47:14 localhost podman[304178]: 2026-02-20 09:47:14.957381577 +0000 UTC m=+0.084334836 container create 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, version=7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux ) Feb 20 04:47:15 localhost systemd[1]: Started libpod-conmon-2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48.scope. Feb 20 04:47:15 localhost podman[304178]: 2026-02-20 09:47:14.921854268 +0000 UTC m=+0.048807557 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:47:15 localhost systemd[1]: Started libcrun container. 
Feb 20 04:47:15 localhost podman[304178]: 2026-02-20 09:47:15.038785772 +0000 UTC m=+0.165739041 container init 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, name=rhceph, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:47:15 localhost podman[304178]: 2026-02-20 09:47:15.049744678 +0000 UTC m=+0.176697947 container start 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1770267347, name=rhceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_CLEAN=True, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 20 04:47:15 localhost podman[304178]: 2026-02-20 09:47:15.050079798 +0000 UTC m=+0.177033077 container attach 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., 
org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7) Feb 20 04:47:15 localhost vigilant_liskov[304194]: 167 167 Feb 20 04:47:15 localhost systemd[1]: libpod-2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48.scope: Deactivated successfully. Feb 20 04:47:15 localhost podman[304178]: 2026-02-20 09:47:15.053057489 +0000 UTC m=+0.180010818 container died 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2026-02-09T10:25:24Z, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.42.2, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:47:15 localhost podman[304199]: 2026-02-20 09:47:15.159868952 +0000 UTC m=+0.092926438 container remove 
2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 20 04:47:15 localhost systemd[1]: libpod-conmon-2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48.scope: Deactivated successfully. Feb 20 04:47:15 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:15 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:15 localhost ceph-mon[301857]: Reconfiguring osd.3 (monmap changed)... 
Feb 20 04:47:15 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 20 04:47:15 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:15 localhost ceph-mon[301857]: Reconfiguring daemon osd.3 on np0005625204.localdomain Feb 20 04:47:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:15 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)... Feb 20 04:47:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 20 04:47:15 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:15 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)... 
Feb 20 04:47:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:15 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:15 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain Feb 20 04:47:15 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain Feb 20 04:47:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:47:15 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:15 localhost podman[304240]: 2026-02-20 09:47:15.588679814 +0000 UTC m=+0.092889777 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': 
{'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:47:15 localhost podman[304240]: 2026-02-20 09:47:15.630030792 +0000 UTC m=+0.134240715 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:47:15 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:47:15 localhost ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44562 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch Feb 20 04:47:15 localhost ceph-mgr[287186]: [cephadm INFO root] Saving service mon spec with placement label:mon Feb 20 04:47:15 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon Feb 20 04:47:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 20 04:47:15 localhost systemd[1]: var-lib-containers-storage-overlay-e6fc8ca174237947f2374abdf5b7141a8f1dec94af8e47ec8436ae8eb86cbfc8-merged.mount: Deactivated successfully. 
Feb 20 04:47:16 localhost podman[304300]: Feb 20 04:47:16 localhost podman[304300]: 2026-02-20 09:47:16.090776113 +0000 UTC m=+0.086343797 container create 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.buildah.version=1.42.2, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container) Feb 20 04:47:16 localhost systemd[1]: Started libpod-conmon-1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b.scope. Feb 20 04:47:16 localhost systemd[1]: Started libcrun container. 
Feb 20 04:47:16 localhost podman[304300]: 2026-02-20 09:47:16.053280013 +0000 UTC m=+0.048847757 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:47:16 localhost podman[304300]: 2026-02-20 09:47:16.166516814 +0000 UTC m=+0.162084498 container init 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, ceph=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:47:16 localhost podman[304300]: 2026-02-20 09:47:16.177354307 +0000 UTC m=+0.172922001 container start 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, version=7, RELEASE=main, release=1770267347) Feb 20 04:47:16 localhost podman[304300]: 2026-02-20 09:47:16.178362157 +0000 UTC m=+0.173929841 container attach 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, 
GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git) Feb 20 04:47:16 localhost thirsty_easley[304315]: 167 167 Feb 20 04:47:16 localhost systemd[1]: libpod-1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b.scope: Deactivated successfully. Feb 20 04:47:16 localhost podman[304300]: 2026-02-20 09:47:16.182524174 +0000 UTC m=+0.178091888 container died 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 20 04:47:16 localhost podman[304320]: 2026-02-20 09:47:16.306537046 +0000 UTC m=+0.110445407 container remove 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, version=7, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2) Feb 20 04:47:16 localhost systemd[1]: libpod-conmon-1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b.scope: Deactivated successfully. Feb 20 04:47:16 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:16 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:16 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:16 localhost ceph-mon[301857]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)... 
Feb 20 04:47:16 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:16 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:16 localhost ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain Feb 20 04:47:16 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 20 04:47:16 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:16 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005625204.exgrzx (monmap changed)... Feb 20 04:47:16 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005625204.exgrzx (monmap changed)... 
Feb 20 04:47:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 20 04:47:16 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0) Feb 20 04:47:16 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch Feb 20 04:47:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:16 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:16 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain Feb 20 04:47:16 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain Feb 20 04:47:16 localhost nova_compute[281288]: 2026-02-20 09:47:16.591 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:16 localhost systemd[1]: var-lib-containers-storage-overlay-4897af6e17dff831bcaf8282f1a1e9a9fe9122d30f753292ae307e1bf5b1a744-merged.mount: Deactivated successfully. 
Feb 20 04:47:17 localhost podman[304387]: Feb 20 04:47:17 localhost podman[304387]: 2026-02-20 09:47:17.098248709 +0000 UTC m=+0.077332590 container create b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 20 04:47:17 localhost systemd[1]: Started libpod-conmon-b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858.scope. Feb 20 04:47:17 localhost systemd[1]: Started libcrun container. 
Feb 20 04:47:17 localhost podman[304387]: 2026-02-20 09:47:17.166235983 +0000 UTC m=+0.145319854 container init b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vcs-type=git, GIT_BRANCH=main, version=7, distribution-scope=public) Feb 20 04:47:17 localhost podman[304387]: 2026-02-20 09:47:17.067171847 +0000 UTC m=+0.046255788 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:47:17 localhost podman[304387]: 2026-02-20 09:47:17.17691522 +0000 UTC m=+0.155999101 container start b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , version=7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:47:17 localhost podman[304387]: 2026-02-20 09:47:17.177342113 +0000 UTC m=+0.156426014 container attach b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, name=rhceph, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, vcs-type=git, 
io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:47:17 localhost recursing_fermi[304402]: 167 167 Feb 20 04:47:17 localhost systemd[1]: libpod-b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858.scope: Deactivated successfully. Feb 20 04:47:17 localhost podman[304387]: 2026-02-20 09:47:17.181230413 +0000 UTC m=+0.160314344 container died b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.42.2, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:47:17 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:47:17 localhost ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44571 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625204", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Feb 20 04:47:17 localhost podman[304407]: 2026-02-20 09:47:17.281844907 +0000 UTC m=+0.092242379 container remove b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, GIT_BRANCH=main, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.42.2, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 20 04:47:17 localhost systemd[1]: libpod-conmon-b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858.scope: Deactivated successfully. 
Feb 20 04:47:17 localhost podman[304413]: 2026-02-20 09:47:17.355075871 +0000 UTC m=+0.141844209 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, release=1770267347, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9) Feb 20 04:47:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:17 localhost podman[304413]: 2026-02-20 09:47:17.380143229 +0000 UTC m=+0.166911537 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal) Feb 20 04:47:17 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:47:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:17 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005625204 (monmap changed)... Feb 20 04:47:17 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005625204 (monmap changed)... Feb 20 04:47:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Feb 20 04:47:17 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:47:17 localhost ceph-mon[301857]: Saving service mon spec with placement label:mon Feb 20 04:47:17 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:17 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:17 localhost ceph-mon[301857]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)... 
Feb 20 04:47:17 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:17 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 20 04:47:17 localhost ceph-mon[301857]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain Feb 20 04:47:17 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Feb 20 04:47:17 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Feb 20 04:47:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:17 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:17 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain Feb 20 04:47:17 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain Feb 20 04:47:17 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:17 localhost sshd[304479]: main: sshd: ssh-rsa 
algorithm is disabled Feb 20 04:47:17 localhost podman[241968]: time="2026-02-20T09:47:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:47:17 localhost podman[241968]: @ - - [20/Feb/2026:09:47:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:47:17 localhost podman[241968]: @ - - [20/Feb/2026:09:47:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1" Feb 20 04:47:17 localhost systemd[1]: tmp-crun.XiG12H.mount: Deactivated successfully. Feb 20 04:47:17 localhost systemd[1]: var-lib-containers-storage-overlay-98f0487996dcd35df07979c1af8c46dbe20aa15d7ab05f3530d709d4e5be6681-merged.mount: Deactivated successfully. Feb 20 04:47:18 localhost podman[304499]: Feb 20 04:47:18 localhost podman[304499]: 2026-02-20 09:47:18.111408 +0000 UTC m=+0.087088029 container create 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, 
build-date=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 20 04:47:18 localhost systemd[1]: Started libpod-conmon-576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853.scope. Feb 20 04:47:18 localhost systemd[1]: tmp-crun.OP2FU0.mount: Deactivated successfully. Feb 20 04:47:18 localhost podman[304499]: 2026-02-20 09:47:18.076288894 +0000 UTC m=+0.051968943 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 20 04:47:18 localhost systemd[1]: Started libcrun container. Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.208 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:47:18 localhost podman[304499]: 2026-02-20 09:47:18.208087473 +0000 UTC m=+0.183767512 container init 
576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.225 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.227 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging [-] 
Could not send notification to notifications. Payload={'message_id': 'cb19a8e6-41fa-42df-a81d-688c9de2f66a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.209799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2509ea4a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': 'f12d86b6019598b37a075595676284b6dad07ec0ac125b6a343c1ff54b2d9b22'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.209799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '250a0566-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': '594f90f7da7b9e3d4074ddabb061a0dcf64ca427cbb4d178bc184549eb084099'}]}, 'timestamp': '2026-02-20 09:47:18.227541', '_unique_id': '6018991eb6e94cfcac6cb53640f8a739'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost sshd[304518]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 04:47:18 localhost podman[304499]: 2026-02-20 09:47:18.232144211 +0000 UTC m=+0.207824250 container start 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.component=rhceph-container, vcs-type=git) Feb 20 04:47:18 localhost podman[304499]: 2026-02-20 09:47:18.233715879 +0000 UTC m=+0.209395988 container attach 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:47:18 localhost focused_maxwell[304514]: 167 167 Feb 20 04:47:18 localhost systemd[1]: libpod-576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853.scope: Deactivated successfully. Feb 20 04:47:18 localhost podman[304499]: 2026-02-20 09:47:18.240104175 +0000 UTC m=+0.215784224 container died 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1770267347, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, version=7, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux , RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.265 12 DEBUG 
ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.266 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e01c4fa4-ee64-4dec-a454-5f2dbf97afcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.232136', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 
'vda'}, 'message_id': '250fe922-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': 'a11bcde4b7038b95c42ae5ba7af2c527ab228c289ac8d784a8756f75df1a010a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.232136', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251002f4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '2a619bb0b430dd18636127eaf094560a289d4d216bb732f8df1d43a6e6f7e3b9'}]}, 'timestamp': '2026-02-20 09:47:18.266888', '_unique_id': 'ab4cad6b0d86413d9257afa0ee405233'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.270 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.271 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.271 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2df98f6-592c-43b5-826b-6da02db31eab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.270954', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2510bde8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '37256375301243feb15a0c3c9a57b0f65dbc3e3945b0dfc44fceab65c00a833c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.270954', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2510d8aa-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '5e6ba177d3c227669dfd67a6434540bc256a34c0ca6c7a80125aeb03f5196878'}]}, 'timestamp': '2026-02-20 09:47:18.272347', '_unique_id': '53de4422663244af85f71edbb8b21331'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.275 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.276 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.276 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9924a88-0846-4aaf-8606-a43f88ea8f19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.276151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2511885e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': '3efce7060e465043917abfa41f9fbaf4450f7f3b1379a1f8c62e84aaf480c97f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.276151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2511a258-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': 'd546d0f33524975e1230d99323942ee9956ea2f5d64f8e49e9527dfffff0d8f6'}]}, 'timestamp': '2026-02-20 09:47:18.277531', '_unique_id': '4e5f64e5a1664ec6a8afdaa521caf75d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.280 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.303 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 15210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78bc9af0-1641-4913-ad54-9cc58902d00b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15210000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:47:18.280849', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2515a286-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.542152542, 'message_signature': 'e3d962fc6af8f3a8e98efaee06301634f48f337572490904b73e1e38ff20bc32'}]}, 'timestamp': '2026-02-20 09:47:18.303785', '_unique_id': 'b5e2564ba7a64479ae3d80caa22badb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.306 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '54bdbd29-2c89-4d38-b9c7-7357152ffc7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.306861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25163660-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '35c6e7bb730ce1bbbf486c1ca913182ddea7ea9b9329739bd9e4a315a16f26da'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.306861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251663ba-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '141d77617591c736795afb06987ec46a7c6c5faf1278472b0cdf7837f4638437'}]}, 'timestamp': '2026-02-20 09:47:18.308689', '_unique_id': '7e2558dadffb4a0ba7c92be57916dda4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '154641d9-740b-4b37-a2b5-178532418b0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.311982', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '2517babc-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': '6dd2581d3aabbd905799ce12bb0293d89a50fdd422c4762382f32b0ff256c372'}]}, 'timestamp': '2026-02-20 09:47:18.317526', '_unique_id': 'c632028817764670afc13e007ab98a7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdb3ba40-c2bc-4849-b67f-de2a8f26d794', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.320524', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '25184fea-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'b72ed21b9aa5a1697a35bc0aafdb9cf97b14cb17cf2fa78c3af1516ec42e54d8'}]}, 'timestamp': '2026-02-20 09:47:18.321329', '_unique_id': '67d850235aa04cc9aa81d3b1a4efc174'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '118f1633-b444-4750-b1ab-95e7a6fbc910', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.325214', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251905f2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': '9c0b4197ddf5f239c10dca0dbf3bf651291ed558bd53aac758b50d10a72442b0'}]}, 'timestamp': '2026-02-20 09:47:18.325994', '_unique_id': '1fb9ca672f264b5fa2c78ea804746565'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ce00f25-61af-4009-84fa-cb900ded8e53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.329141', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '25199f76-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'b6848144d0c7a1f88d513973b289c6a8c8c576084ff0c95ae83ea3f62a75ce9a'}]}, 'timestamp': '2026-02-20 09:47:18.329929', '_unique_id': '04a96b61e93a47cd951706fd8a55e2d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20
04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.333 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3742ff5f-278b-42ec-a4cf-3a8263c3e4d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.333066', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251a388c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': '283a7bf1cbe253ccaf858d62f31ef76e9d7153228a2750d7b6be1a2ac5824923'}]}, 'timestamp': '2026-02-20 09:47:18.333847', '_unique_id': '73ed27d201114bbc887c4f339c394db0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.336 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.337 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2f990f3-a23c-41d8-8e63-8464ac4beec5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.337000', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251ad21a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'ca8d1c4baf7fed3560b5d8984e23406756e5a11cba655f3242b718cb5702c0e4'}]}, 'timestamp': '2026-02-20 09:47:18.337791', '_unique_id': 'adffd59b3afc4d319cc2c02fcd3c9b63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:47:18.339 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9529bba3-b1ed-4be6-8132-4bf144d83382', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.341163', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '251b75a8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': '830696075ce24cdc19feea4d089d25ba095da4894ac58c83e5bc95814096beb1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.341163', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251b9006-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': 'd933fbbeaab92dfe4b60b23e27682446114fe90f7ef79bafb20e6a766904c6b8'}]}, 'timestamp': '2026-02-20 09:47:18.342483', '_unique_id': '50a6752ca3414ea5a841e26b795a7c0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:47:18.343 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:47:18.343 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.344 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 
ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd48ee688-c1c2-43c2-890e-b1a4eb914646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.344615', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251bf67c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'ea75c244dd79e7b82a1378ec21a609449c14eb2d7a725ca1eb07c10e1b528fb5'}]}, 'timestamp': '2026-02-20 09:47:18.345071', '_unique_id': '4b0144e18d8d46f4a1101d60b1f98fdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR 
oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.348 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.348 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.348 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '64c2a4d2-0116-4117-b2ae-4df8833e90ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.348241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '251c845c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '761747582922777899c5214d22268d85ae34496d2a2faad8d9fd824f1dbdbce8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.348241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251c98ac-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '6135ffe1cd9ffbf5fec225da4a86b90d1cedef5030f5a33851fd7e240e99e973'}]}, 'timestamp': '2026-02-20 09:47:18.349253', '_unique_id': 'c25b74f4f4974d85957e70e3204fe954'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.351 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1fac47ff-53ca-43fb-a9b5-3203d849b81f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.351133', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251cf126-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'a4f10e7e3deea243c3575d25b0dbe038bdc3e8969d725c1ac899dbe3bda07af7'}]}, 'timestamp': '2026-02-20 09:47:18.351589', '_unique_id': '6f3c03a8099046568282010dbfa89ae9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.353 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.353 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfadfb57-222b-4438-9c8a-2e70058386b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.353208', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251d450e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'eb888c0db1b8554c4c529d1931769545ec1fb261f982026f52bebd76363b47e9'}]}, 'timestamp': '2026-02-20 09:47:18.353625', '_unique_id': 'a6804f177f6147ef934ca868c2e13e61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:47:18.354 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.355 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.355 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.356 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8e664bbb-ace9-4b62-8bb5-b109322d1d58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.355722', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '251da95e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '4d7b94ae08f5d4afcac1005d256a4e23577ebb580b7a665ef2a0bd66ab14130a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.355722', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251db9da-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': 'b3692d8b0e3a8ccb44bf9f1d6510f2af0170913c0932059322cd2342b236f304'}]}, 'timestamp': '2026-02-20 09:47:18.356653', '_unique_id': 'dc5b899f1e8f4c35a954f21d3bcd3b96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.358 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.358 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6452b0e5-a183-4197-bd7b-85c6daac0199', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.358732', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251e1b96-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': '0988c2bedf8245a84e795d14f593f3e7a9ea49d63a4abe807b02620b6cf62457'}]}, 'timestamp': '2026-02-20 09:47:18.359087', '_unique_id': 'cb01b56aefbe48159a4c8139975ee9f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.360 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4973c59-993d-4d15-ad63-fb05d2350231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.361127', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '251e7a96-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '0af73b3b416ac3a230ab1301028490f89f1dc8d38b37bebaf93fcf5a94959a34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.361127', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251e8c8e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '957b92132113e09d2a9e679ccab2a9e95bc87bdbb8dbf020103efcb77d19a8ad'}]}, 'timestamp': '2026-02-20 09:47:18.361966', '_unique_id': 'd9f91f9576de4ec4869bf3d77a352ed1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.364 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2a208359-c6ec-4db1-a702-eef7c95d238c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:47:18.364251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '251ef2d2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.542152542, 'message_signature': 'c56d710976f336ce35f189ae75f30c88abc635f935e87b6c7293c5ac3e4353d4'}]}, 'timestamp': '2026-02-20 09:47:18.364583', '_unique_id': 'd02723f99e1c4aad82704c5842f1ad65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:47:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:47:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging Feb 20 04:47:18 localhost podman[304521]: 2026-02-20 09:47:18.378683181 +0000 UTC m=+0.127154067 container remove 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 20 04:47:18 localhost systemd[1]: libpod-conmon-576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853.scope: Deactivated successfully. Feb 20 04:47:18 localhost ceph-mon[301857]: Reconfiguring mon.np0005625204 (monmap changed)... 
Feb 20 04:47:18 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:18 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:47:18 localhost ceph-mon[301857]: Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 20 04:47:18 localhost ceph-mgr[287186]: [progress INFO root] update: starting ev 3648edd9-f15d-4e63-81de-22936437324c (Updating node-proxy deployment (+3 -> 3)) Feb 20 04:47:18 localhost 
ceph-mgr[287186]: [progress INFO root] complete: finished ev 3648edd9-f15d-4e63-81de-22936437324c (Updating node-proxy deployment (+3 -> 3)) Feb 20 04:47:18 localhost ceph-mgr[287186]: [progress INFO root] Completed event 3648edd9-f15d-4e63-81de-22936437324c (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Feb 20 04:47:18 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005625202 (monmap changed)... Feb 20 04:47:18 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005625202 (monmap changed)... Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Feb 20 04:47:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:18 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' 
entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:18 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:47:18 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:47:18 localhost systemd[1]: tmp-crun.mMBt74.mount: Deactivated successfully. Feb 20 04:47:18 localhost systemd[1]: var-lib-containers-storage-overlay-8f7489e31821167baff6543526491bffd4e0075d2db1fdc8c7f8dc11ad648788-merged.mount: Deactivated successfully. Feb 20 04:47:19 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:19 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:47:19 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:19 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:19 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:19 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:47:19 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:19 localhost systemd[1]: session-70.scope: Deactivated successfully. Feb 20 04:47:19 localhost systemd[1]: session-70.scope: Consumed 1.764s CPU time. Feb 20 04:47:19 localhost systemd-logind[759]: Session 70 logged out. Waiting for processes to exit. Feb 20 04:47:19 localhost systemd-logind[759]: Removed session 70. 
Feb 20 04:47:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0) Feb 20 04:47:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0) Feb 20 04:47:19 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005625203 (monmap changed)... Feb 20 04:47:19 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005625203 (monmap changed)... Feb 20 04:47:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Feb 20 04:47:19 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:47:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Feb 20 04:47:19 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Feb 20 04:47:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:47:19 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:47:19 localhost ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain Feb 20 04:47:19 localhost ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005625203 on 
np0005625203.localdomain Feb 20 04:47:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:20 localhost ceph-mon[301857]: Reconfiguring mon.np0005625202 (monmap changed)... Feb 20 04:47:20 localhost ceph-mon[301857]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain Feb 20 04:47:20 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:20 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 20 04:47:20 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0) Feb 20 04:47:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0) Feb 20 04:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:47:21 localhost podman[304556]: 2026-02-20 09:47:21.153815024 +0000 UTC m=+0.085020646 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:47:21 localhost 
podman[304555]: 2026-02-20 09:47:21.229520824 +0000 UTC m=+0.160533971 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 04:47:21 localhost podman[304556]: 2026-02-20 09:47:21.23947904 +0000 UTC m=+0.170684642 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:47:21 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:47:21 localhost podman[304555]: 2026-02-20 09:47:21.306183383 +0000 UTC m=+0.237196550 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:47:21 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:47:21 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:21 localhost ceph-mon[301857]: Reconfiguring mon.np0005625203 (monmap changed)... 
Feb 20 04:47:21 localhost ceph-mon[301857]: Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain Feb 20 04:47:21 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:21 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:21 localhost nova_compute[281288]: 2026-02-20 09:47:21.593 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:47:23 localhost ceph-mgr[287186]: [progress INFO root] Writing back 50 completed events Feb 20 04:47:23 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 20 04:47:23 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:24 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:47:24 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:25 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:26 localhost openstack_network_exporter[244414]: ERROR 09:47:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:47:26 localhost openstack_network_exporter[244414]: Feb 20 04:47:26 localhost openstack_network_exporter[244414]: ERROR 09:47:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:47:26 localhost openstack_network_exporter[244414]: Feb 20 04:47:26 localhost nova_compute[281288]: 2026-02-20 09:47:26.596 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:27 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:28 localhost ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections.. Feb 20 04:47:28 localhost ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: [] Feb 20 04:47:28 localhost ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections.. Feb 20 04:47:28 localhost ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: [] Feb 20 04:47:28 localhost ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections.. Feb 20 04:47:28 localhost ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: [] Feb 20 04:47:29 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:29 localhost systemd[1]: Stopping User Manager for UID 1003... Feb 20 04:47:29 localhost systemd[299695]: Activating special unit Exit the Session... Feb 20 04:47:29 localhost systemd[299695]: Stopped target Main User Target. Feb 20 04:47:29 localhost systemd[299695]: Stopped target Basic System. Feb 20 04:47:29 localhost systemd[299695]: Stopped target Paths. Feb 20 04:47:29 localhost systemd[299695]: Stopped target Sockets. Feb 20 04:47:29 localhost systemd[299695]: Stopped target Timers. Feb 20 04:47:29 localhost systemd[299695]: Stopped Mark boot as successful after the user session has run 2 minutes. Feb 20 04:47:29 localhost systemd[299695]: Stopped Daily Cleanup of User's Temporary Directories. Feb 20 04:47:29 localhost systemd[299695]: Closed D-Bus User Message Bus Socket. Feb 20 04:47:29 localhost systemd[299695]: Stopped Create User's Volatile Files and Directories. Feb 20 04:47:29 localhost systemd[299695]: Removed slice User Application Slice. 
Feb 20 04:47:29 localhost systemd[299695]: Reached target Shutdown. Feb 20 04:47:29 localhost systemd[299695]: Finished Exit the Session. Feb 20 04:47:29 localhost systemd[299695]: Reached target Exit the Session. Feb 20 04:47:29 localhost systemd[1]: user@1003.service: Deactivated successfully. Feb 20 04:47:29 localhost systemd[1]: Stopped User Manager for UID 1003. Feb 20 04:47:29 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Feb 20 04:47:29 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Feb 20 04:47:29 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Feb 20 04:47:29 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Feb 20 04:47:29 localhost systemd[1]: Removed slice User Slice of UID 1003. Feb 20 04:47:29 localhost systemd[1]: user-1003.slice: Consumed 2.386s CPU time. Feb 20 04:47:29 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:47:30 localhost podman[304599]: 2026-02-20 09:47:30.887937052 +0000 UTC m=+0.096866740 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack 
Kubernetes Operator team) Feb 20 04:47:30 localhost podman[304599]: 2026-02-20 09:47:30.902443027 +0000 UTC m=+0.111372745 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team) Feb 20 04:47:30 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:47:31 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:31 localhost nova_compute[281288]: 2026-02-20 09:47:31.600 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:33 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:34 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:35 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:36 localhost nova_compute[281288]: 2026-02-20 09:47:36.602 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:47:37 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:39 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:39 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:47:40 localhost podman[304618]: 2026-02-20 09:47:40.149399955 +0000 UTC m=+0.086249895 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:47:40 localhost podman[304618]: 2026-02-20 09:47:40.163082104 +0000 UTC m=+0.099932034 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:47:40 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:47:41 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:41 localhost nova_compute[281288]: 2026-02-20 09:47:41.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:43 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:44 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:45 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:46 localhost sshd[304642]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:47:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:47:46 localhost podman[304643]: 2026-02-20 09:47:46.140691773 +0000 UTC m=+0.073269366 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:47:46 localhost podman[304643]: 2026-02-20 09:47:46.17419064 +0000 UTC m=+0.106768193 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:47:46 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:47:46 localhost nova_compute[281288]: 2026-02-20 09:47:46.607 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:47:46 localhost nova_compute[281288]: 2026-02-20 09:47:46.609 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:47:46 localhost nova_compute[281288]: 2026-02-20 09:47:46.609 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:47:46 localhost nova_compute[281288]: 2026-02-20 09:47:46.610 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:47:46 localhost nova_compute[281288]: 2026-02-20 09:47:46.611 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:46 localhost nova_compute[281288]: 2026-02-20 09:47:46.611 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:47:47 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:47 localhost podman[241968]: time="2026-02-20T09:47:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:47:47 localhost podman[241968]: @ - - [20/Feb/2026:09:47:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:47:47 localhost podman[241968]: @ - - [20/Feb/2026:09:47:47 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1" Feb 20 04:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:47:48 localhost systemd[1]: tmp-crun.2J8KES.mount: Deactivated successfully. Feb 20 04:47:48 localhost podman[304667]: 2026-02-20 09:47:48.155250025 +0000 UTC m=+0.091650340 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Feb 20 04:47:48 localhost podman[304667]: 2026-02-20 09:47:48.172117472 +0000 UTC m=+0.108517787 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible) Feb 20 04:47:48 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:47:48 localhost nova_compute[281288]: 2026-02-20 09:47:48.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:48 localhost nova_compute[281288]: 2026-02-20 09:47:48.741 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:47:48 localhost nova_compute[281288]: 2026-02-20 09:47:48.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:47:48 localhost nova_compute[281288]: 2026-02-20 09:47:48.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:47:48 localhost nova_compute[281288]: 2026-02-20 09:47:48.742 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:47:48 localhost nova_compute[281288]: 2026-02-20 09:47:48.743 281292 DEBUG oslo_concurrency.processutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:47:49 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:47:49 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3781215475' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.201 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.276 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.277 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.511 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:47:49 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.514 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11726MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.515 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.516 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.602 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.603 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.603 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:47:49 localhost nova_compute[281288]: 2026-02-20 09:47:49.702 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:47:49 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:50 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:47:50 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/190951353' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:47:50 localhost nova_compute[281288]: 2026-02-20 09:47:50.161 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:47:50 localhost nova_compute[281288]: 2026-02-20 09:47:50.168 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:47:50 localhost nova_compute[281288]: 2026-02-20 09:47:50.191 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:47:50 localhost nova_compute[281288]: 2026-02-20 09:47:50.194 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:47:50 localhost nova_compute[281288]: 2026-02-20 09:47:50.194 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:47:51 localhost nova_compute[281288]: 2026-02-20 09:47:51.195 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:51 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:51 localhost nova_compute[281288]: 2026-02-20 09:47:51.612 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:51 localhost nova_compute[281288]: 2026-02-20 09:47:51.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:51 localhost nova_compute[281288]: 2026-02-20 09:47:51.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:47:52 localhost podman[304732]: 2026-02-20 09:47:52.138862524 +0000 UTC m=+0.077128525 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:47:52 localhost systemd[1]: tmp-crun.XJaV3b.mount: Deactivated successfully. 
Feb 20 04:47:52 localhost podman[304732]: 2026-02-20 09:47:52.212989016 +0000 UTC m=+0.151254997 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:47:52 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:47:52 localhost podman[304733]: 2026-02-20 09:47:52.218194205 +0000 UTC m=+0.150643428 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Feb 20 04:47:52 localhost 
podman[304733]: 2026-02-20 09:47:52.303233631 +0000 UTC m=+0.235682884 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible) Feb 20 04:47:52 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:47:52 localhost nova_compute[281288]: 2026-02-20 09:47:52.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:52 localhost nova_compute[281288]: 2026-02-20 09:47:52.719 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:52 localhost nova_compute[281288]: 2026-02-20 09:47:52.747 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:52 localhost nova_compute[281288]: 2026-02-20 09:47:52.747 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:53 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:53 localhost nova_compute[281288]: 2026-02-20 09:47:53.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:53 localhost nova_compute[281288]: 2026-02-20 09:47:53.721 281292 DEBUG 
nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:47:53 localhost nova_compute[281288]: 2026-02-20 09:47:53.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:47:54 localhost nova_compute[281288]: 2026-02-20 09:47:54.204 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:47:54 localhost nova_compute[281288]: 2026-02-20 09:47:54.205 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:47:54 localhost nova_compute[281288]: 2026-02-20 09:47:54.205 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:47:54 localhost nova_compute[281288]: 2026-02-20 09:47:54.205 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:47:54 localhost nova_compute[281288]: 2026-02-20 09:47:54.602 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 
- - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:47:54 localhost nova_compute[281288]: 2026-02-20 09:47:54.620 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:47:54 localhost nova_compute[281288]: 2026-02-20 09:47:54.620 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:47:54 
localhost nova_compute[281288]: 2026-02-20 09:47:54.621 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:47:54 localhost nova_compute[281288]: 2026-02-20 09:47:54.622 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:47:54 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:47:55 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:56 localhost openstack_network_exporter[244414]: ERROR 09:47:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:47:56 localhost openstack_network_exporter[244414]: Feb 20 04:47:56 localhost openstack_network_exporter[244414]: ERROR 09:47:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:47:56 localhost openstack_network_exporter[244414]: Feb 20 04:47:56 localhost nova_compute[281288]: 2026-02-20 09:47:56.614 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:56 localhost nova_compute[281288]: 2026-02-20 09:47:56.618 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:47:57 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB 
avail Feb 20 04:47:58 localhost ceph-mgr[287186]: [balancer INFO root] Optimize plan auto_2026-02-20_09:47:58 Feb 20 04:47:58 localhost ceph-mgr[287186]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Feb 20 04:47:58 localhost ceph-mgr[287186]: [balancer INFO root] do_upmap Feb 20 04:47:58 localhost ceph-mgr[287186]: [balancer INFO root] pools ['.mgr', 'manila_metadata', 'volumes', 'manila_data', 'images', 'backups', 'vms'] Feb 20 04:47:58 localhost ceph-mgr[287186]: [balancer INFO root] prepared 0/10 changes Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] _maybe_adjust Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1) Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32) Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32) Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Feb 20 04:47:58 localhost ceph-mgr[287186]: 
[pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Feb 20 04:47:58 localhost ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16) Feb 20 04:47:58 localhost ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections.. Feb 20 04:47:58 localhost ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: [] Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: vms, start_after= Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: volumes, start_after= Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: images, start_after= Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: backups, start_after= Feb 20 04:47:58 localhost ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections.. Feb 20 04:47:58 localhost ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: [] Feb 20 04:47:58 localhost ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections.. 
Feb 20 04:47:58 localhost ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: [] Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: vms, start_after= Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: volumes, start_after= Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: images, start_after= Feb 20 04:47:58 localhost ceph-mgr[287186]: [rbd_support INFO root] load_schedules: backups, start_after= Feb 20 04:47:59 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:47:59 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:48:01 localhost sshd[304788]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:48:01 localhost podman[304775]: 2026-02-20 09:48:01.147472266 +0000 UTC m=+0.086699038 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:48:01 localhost podman[304775]: 2026-02-20 09:48:01.184805761 +0000 UTC m=+0.124032493 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:48:01 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:48:01 localhost ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44589 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Feb 20 04:48:01 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:01 localhost nova_compute[281288]: 2026-02-20 09:48:01.616 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:01 localhost nova_compute[281288]: 2026-02-20 09:48:01.621 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:03 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:05 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:48:06.011 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:48:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:48:06.012 162652 DEBUG 
oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:48:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:48:06.012 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:48:06 localhost nova_compute[281288]: 2026-02-20 09:48:06.619 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:07 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:09 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:48:11 localhost podman[304796]: 2026-02-20 09:48:11.139683636 +0000 UTC m=+0.082599862 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:48:11 localhost podman[304796]: 2026-02-20 09:48:11.175666549 +0000 UTC m=+0.118582735 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:48:11 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:48:11 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:11 localhost nova_compute[281288]: 2026-02-20 09:48:11.622 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:13 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:14 localhost ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44610 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Feb 20 04:48:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:15 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:16 localhost nova_compute[281288]: 2026-02-20 09:48:16.624 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:48:17 localhost podman[304817]: 2026-02-20 09:48:17.149706239 +0000 UTC m=+0.089245585 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:48:17 localhost podman[304817]: 2026-02-20 09:48:17.183851457 +0000 UTC m=+0.123390813 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:48:17 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:48:17 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:17 localhost podman[241968]: time="2026-02-20T09:48:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:48:17 localhost podman[241968]: @ - - [20/Feb/2026:09:48:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:48:17 localhost podman[241968]: @ - - [20/Feb/2026:09:48:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18294 "" "Go-http-client/1.1" Feb 20 04:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:48:19 localhost podman[304841]: 2026-02-20 09:48:19.140506193 +0000 UTC m=+0.082790748 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git) Feb 20 04:48:19 localhost podman[304841]: 2026-02-20 09:48:19.153239474 +0000 UTC m=+0.095524039 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7) Feb 20 04:48:19 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:48:19 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:21 localhost ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail Feb 20 04:48:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 20 04:48:21 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 20 04:48:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 20 04:48:21 localhost ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:48:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 20 04:48:21 
localhost ceph-mgr[287186]: [progress INFO root] update: starting ev f0532645-8d13-4eb4-a222-42dd4abbb158 (Updating node-proxy deployment (+3 -> 3)) Feb 20 04:48:21 localhost ceph-mgr[287186]: [progress INFO root] complete: finished ev f0532645-8d13-4eb4-a222-42dd4abbb158 (Updating node-proxy deployment (+3 -> 3)) Feb 20 04:48:21 localhost ceph-mgr[287186]: [progress INFO root] Completed event f0532645-8d13-4eb4-a222-42dd4abbb158 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Feb 20 04:48:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Feb 20 04:48:21 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Feb 20 04:48:21 localhost nova_compute[281288]: 2026-02-20 09:48:21.627 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:22 localhost ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:48:22 localhost ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' Feb 20 04:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:48:23 localhost podman[304947]: 2026-02-20 09:48:23.150843051 +0000 UTC m=+0.084046077 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:48:23 localhost podman[304947]: 2026-02-20 09:48:23.225354084 +0000 UTC m=+0.158557090 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 04:48:23 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:48:23 localhost podman[304948]: 2026-02-20 09:48:23.228843592 +0000 UTC m=+0.156912160 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:48:23 localhost 
ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 e91: 6 total, 6 up, 6 in Feb 20 04:48:23 localhost podman[304948]: 2026-02-20 09:48:23.309360329 +0000 UTC m=+0.237428867 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2) Feb 20 04:48:23 localhost ceph-mgr[287186]: mgr handle_mgr_map I was active but no longer am Feb 20 04:48:23 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:23.314+0000 7f0941015640 -1 mgr handle_mgr_map I was active but no longer am Feb 20 04:48:23 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:48:23 localhost systemd[1]: session-72.scope: Deactivated successfully. Feb 20 04:48:23 localhost systemd[1]: session-72.scope: Consumed 11.972s CPU time. Feb 20 04:48:23 localhost systemd-logind[759]: Session 72 logged out. Waiting for processes to exit. Feb 20 04:48:23 localhost systemd-logind[759]: Removed session 72. Feb 20 04:48:23 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: ignoring --setuser ceph since I am not root Feb 20 04:48:23 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: ignoring --setgroup ceph since I am not root Feb 20 04:48:23 localhost ceph-mgr[287186]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mgr, pid 2 Feb 20 04:48:23 localhost ceph-mgr[287186]: pidfile_write: ignore empty --pid-file Feb 20 04:48:23 localhost ceph-mon[301857]: from='client.? 172.18.0.200:0/2835510203' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 20 04:48:23 localhost ceph-mon[301857]: Activating manager daemon np0005625202.arwxwo Feb 20 04:48:23 localhost ceph-mon[301857]: from='client.? 
172.18.0.200:0/2835510203' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 20 04:48:23 localhost ceph-mon[301857]: Manager daemon np0005625202.arwxwo is now available Feb 20 04:48:23 localhost ceph-mgr[287186]: mgr[py] Loading python module 'alerts' Feb 20 04:48:23 localhost ceph-mgr[287186]: mgr[py] Module alerts has missing NOTIFY_TYPES member Feb 20 04:48:23 localhost ceph-mgr[287186]: mgr[py] Loading python module 'balancer' Feb 20 04:48:23 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:23.550+0000 7f64ba62c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Feb 20 04:48:23 localhost sshd[305012]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:48:23 localhost ceph-mgr[287186]: mgr[py] Module balancer has missing NOTIFY_TYPES member Feb 20 04:48:23 localhost ceph-mgr[287186]: mgr[py] Loading python module 'cephadm' Feb 20 04:48:23 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:23.626+0000 7f64ba62c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Feb 20 04:48:23 localhost systemd-logind[759]: New session 73 of user ceph-admin. Feb 20 04:48:23 localhost systemd[1]: Started Session 73 of User ceph-admin. 
Feb 20 04:48:24 localhost ceph-mgr[287186]: mgr[py] Loading python module 'crash' Feb 20 04:48:24 localhost ceph-mgr[287186]: mgr[py] Module crash has missing NOTIFY_TYPES member Feb 20 04:48:24 localhost ceph-mgr[287186]: mgr[py] Loading python module 'dashboard' Feb 20 04:48:24 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:24.366+0000 7f64ba62c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Feb 20 04:48:24 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/mirror_snapshot_schedule"} : dispatch Feb 20 04:48:24 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/trash_purge_schedule"} : dispatch Feb 20 04:48:24 localhost sshd[305121]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:48:24 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:24 localhost systemd[1]: tmp-crun.WZZuTt.mount: Deactivated successfully. 
Feb 20 04:48:24 localhost podman[305130]: 2026-02-20 09:48:24.87777572 +0000 UTC m=+0.114838010 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1770267347, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main)
Feb 20 04:48:24 localhost ceph-mgr[287186]: mgr[py] Loading python module 'devicehealth'
Feb 20 04:48:24 localhost podman[305130]: 2026-02-20 09:48:24.975809657 +0000 UTC m=+0.212871897 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vendor=Red Hat, Inc., name=rhceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 04:48:24 localhost ceph-mgr[287186]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 20 04:48:24 localhost ceph-mgr[287186]: mgr[py] Loading python module 'diskprediction_local'
Feb 20 04:48:24 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:24.989+0000 7f64ba62c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 20 04:48:25 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 20 04:48:25 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 20 04:48:25 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: from numpy import show_config as show_numpy_config
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Loading python module 'influx'
Feb 20 04:48:25 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:25.148+0000 7f64ba62c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Loading python module 'insights'
Feb 20 04:48:25 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:25.215+0000 7f64ba62c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Loading python module 'iostat'
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Loading python module 'k8sevents'
Feb 20 04:48:25 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:25.342+0000 7f64ba62c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Loading python module 'localpool'
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Loading python module 'mds_autoscaler'
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Loading python module 'mirroring'
Feb 20 04:48:25 localhost ceph-mgr[287186]: mgr[py] Loading python module 'nfs'
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Loading python module 'orchestrator'
Feb 20 04:48:26 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.167+0000 7f64ba62c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.321+0000 7f64ba62c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Loading python module 'osd_perf_query'
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.387+0000 7f64ba62c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Loading python module 'osd_support'
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.444+0000 7f64ba62c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Loading python module 'pg_autoscaler'
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Loading python module 'progress'
Feb 20 04:48:26 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.512+0000 7f64ba62c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost openstack_network_exporter[244414]: ERROR 09:48:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 04:48:26 localhost openstack_network_exporter[244414]:
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost openstack_network_exporter[244414]: ERROR 09:48:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 04:48:26 localhost openstack_network_exporter[244414]:
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Loading python module 'prometheus'
Feb 20 04:48:26 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.572+0000 7f64ba62c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost nova_compute[281288]: 2026-02-20 09:48:26.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Loading python module 'rbd_support'
Feb 20 04:48:26 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.871+0000 7f64ba62c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mgr[287186]: mgr[py] Loading python module 'restful'
Feb 20 04:48:26 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.956+0000 7f64ba62c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 20 04:48:26 localhost ceph-mon[301857]: [20/Feb/2026:09:48:24] ENGINE Bus STARTING
Feb 20 04:48:26 localhost ceph-mon[301857]: [20/Feb/2026:09:48:25] ENGINE Serving on http://172.18.0.106:8765
Feb 20 04:48:26 localhost ceph-mon[301857]: [20/Feb/2026:09:48:25] ENGINE Serving on https://172.18.0.106:7150
Feb 20 04:48:26 localhost ceph-mon[301857]: [20/Feb/2026:09:48:25] ENGINE Client ('172.18.0.106', 54518) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 04:48:26 localhost ceph-mon[301857]: [20/Feb/2026:09:48:25] ENGINE Bus STARTED
Feb 20 04:48:26 localhost ceph-mon[301857]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 20 04:48:26 localhost ceph-mon[301857]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 20 04:48:26 localhost ceph-mon[301857]: Cluster is now healthy
Feb 20 04:48:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Loading python module 'rgw'
Feb 20 04:48:27 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:27.288+0000 7f64ba62c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Loading python module 'rook'
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 20 04:48:27 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:27.699+0000 7f64ba62c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Loading python module 'selftest'
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 20 04:48:27 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:27.769+0000 7f64ba62c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Loading python module 'snap_schedule'
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Loading python module 'stats'
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Loading python module 'status'
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 20 04:48:27 localhost ceph-mgr[287186]: mgr[py] Loading python module 'telegraf'
Feb 20 04:48:27 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:27.969+0000 7f64ba62c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Loading python module 'telemetry'
Feb 20 04:48:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.027+0000 7f64ba62c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Loading python module 'test_orchestrator'
Feb 20 04:48:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.162+0000 7f64ba62c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Loading python module 'volumes'
Feb 20 04:48:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.310+0000 7f64ba62c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Loading python module 'zabbix'
Feb 20 04:48:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.505+0000 7f64ba62c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 04:48:28 localhost ceph-mon[301857]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 04:48:28 localhost ceph-mon[301857]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 04:48:28 localhost ceph-mon[301857]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 04:48:28 localhost ceph-mon[301857]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:48:28 localhost ceph-mon[301857]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 04:48:28 localhost ceph-mon[301857]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 04:48:28 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 04:48:28 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 04:48:28 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 04:48:28 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 04:48:28 localhost ceph-mgr[287186]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.565+0000 7f64ba62c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 20 04:48:28 localhost ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x559ff0eb7600 mon_map magic: 0 from mon.2 v2:172.18.0.105:3300/0
Feb 20 04:48:28 localhost ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.106:6810/1895764224
Feb 20 04:48:29 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:48:29 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:48:29 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 04:48:29 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 04:48:29 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:48:30 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 04:48:30 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 04:48:30 localhost ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 04:48:30 localhost ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 04:48:30 localhost ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 04:48:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:31 localhost nova_compute[281288]: 2026-02-20 09:48:31.633 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 04:48:31 localhost nova_compute[281288]: 2026-02-20 09:48:31.635 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:48:31 localhost nova_compute[281288]: 2026-02-20 09:48:31.635 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 04:48:31 localhost nova_compute[281288]: 2026-02-20 09:48:31.635 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:48:31 localhost nova_compute[281288]: 2026-02-20 09:48:31.636 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 04:48:31 localhost nova_compute[281288]: 2026-02-20 09:48:31.638 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:48:31 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 04:48:31 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.927869) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580911928831, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2719, "num_deletes": 256, "total_data_size": 8670897, "memory_usage": 8956368, "flush_reason": "Manual Compaction"}
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580911953979, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5220534, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11866, "largest_seqno": 14580, "table_properties": {"data_size": 5209498, "index_size": 6901, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27059, "raw_average_key_size": 22, "raw_value_size": 5185912, "raw_average_value_size": 4264, "num_data_blocks": 299, "num_entries": 1216, "num_filter_entries": 1216, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580813, "oldest_key_time": 1771580813, "file_creation_time": 1771580911, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 26200 microseconds, and 11901 cpu microseconds.
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.954056) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5220534 bytes OK
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.954100) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.956556) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.956584) EVENT_LOG_v1 {"time_micros": 1771580911956577, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.956618) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 8657884, prev total WAL file size 8657884, number of live WAL files 2.
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.961916) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(5098KB)], [15(15MB)]
Feb 20 04:48:31 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580911961979, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 21589879, "oldest_snapshot_seqno": -1}
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 11901 keys, 18468504 bytes, temperature: kUnknown
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580912041558, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 18468504, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18399782, "index_size": 37901, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29765, "raw_key_size": 319044, "raw_average_key_size": 26, "raw_value_size": 18196238, "raw_average_value_size": 1528, "num_data_blocks": 1448, "num_entries": 11901, "num_filter_entries": 11901, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771580911, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.042087) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 18468504 bytes
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.043932) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 270.6 rd, 231.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.0, 15.6 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(7.7) write-amplify(3.5) OK, records in: 12441, records dropped: 540 output_compression: NoCompression
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.043966) EVENT_LOG_v1 {"time_micros": 1771580912043950, "job": 6, "event": "compaction_finished", "compaction_time_micros": 79788, "compaction_time_cpu_micros": 50187, "output_level": 6, "num_output_files": 1, "total_output_size": 18468504, "num_input_records": 12441, "num_output_records": 11901, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580912044960, "job": 6, "event": "table_file_deletion", "file_number": 17}
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580912048211, "job": 6, "event": "table_file_deletion", "file_number": 15}
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.961772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048347) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048358) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:48:32 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:48:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:48:32 localhost podman[306049]: 2026-02-20 09:48:32.176829469 +0000 UTC m=+0.097832942 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 20 04:48:32 localhost podman[306049]: 2026-02-20 09:48:32.216188404 +0000 UTC m=+0.137191907 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 04:48:32 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 04:48:34 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo'
Feb 20 04:48:34 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:48:36 localhost nova_compute[281288]: 2026-02-20 09:48:36.637 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:48:39 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:48:41 localhost nova_compute[281288]: 2026-02-20 09:48:41.639 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:48:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 04:48:42 localhost systemd[292696]: Created slice User Background Tasks Slice.
Feb 20 04:48:42 localhost systemd[292696]: Starting Cleanup of User's Temporary Files and Directories...
Feb 20 04:48:42 localhost podman[306067]: 2026-02-20 09:48:42.768726659 +0000 UTC m=+0.705504707 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:48:42 localhost systemd[292696]: Finished Cleanup of User's Temporary Files and Directories. 
Feb 20 04:48:42 localhost podman[306067]: 2026-02-20 09:48:42.801150916 +0000 UTC m=+0.737928954 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:48:42 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:48:43 localhost sshd[306090]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:48:44 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:46 localhost nova_compute[281288]: 2026-02-20 09:48:46.642 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:47 localhost podman[241968]: time="2026-02-20T09:48:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:48:47 localhost podman[241968]: @ - - [20/Feb/2026:09:48:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:48:47 localhost podman[241968]: @ - - [20/Feb/2026:09:48:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18293 "" "Go-http-client/1.1" Feb 20 04:48:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:48:48 localhost podman[306092]: 2026-02-20 09:48:48.136971593 +0000 UTC m=+0.076393862 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:48:48 localhost podman[306092]: 2026-02-20 09:48:48.148112076 +0000 UTC m=+0.087534365 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:48:48 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:48:49 localhost nova_compute[281288]: 2026-02-20 09:48:49.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:49 localhost nova_compute[281288]: 2026-02-20 09:48:49.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:49 localhost nova_compute[281288]: 2026-02-20 09:48:49.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:48:49 localhost nova_compute[281288]: 2026-02-20 09:48:49.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:48:49 localhost nova_compute[281288]: 2026-02-20 09:48:49.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:48:49 localhost nova_compute[281288]: 2026-02-20 09:48:49.744 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute 
resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:48:49 localhost nova_compute[281288]: 2026-02-20 09:48:49.744 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:48:49 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:48:50 localhost podman[306138]: 2026-02-20 09:48:50.144933489 +0000 UTC m=+0.079249668 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, 
maintainer=Red Hat, Inc.) Feb 20 04:48:50 localhost podman[306138]: 2026-02-20 09:48:50.158476422 +0000 UTC m=+0.092792641 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, version=9.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.) Feb 20 04:48:50 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:48:50 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:48:50 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/918867819' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.216 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.296 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.297 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.518 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.520 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11778MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.521 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.521 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.592 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.592 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.593 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:48:50 localhost nova_compute[281288]: 2026-02-20 09:48:50.631 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:48:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:48:51 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/825169528' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.124 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.132 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.226 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.229 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.229 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.644 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.646 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.646 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.647 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.648 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:48:51 localhost nova_compute[281288]: 2026-02-20 09:48:51.651 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:48:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:48:54 localhost systemd[1]: tmp-crun.ihY7mx.mount: Deactivated successfully. Feb 20 04:48:54 localhost podman[306182]: 2026-02-20 09:48:54.140826403 +0000 UTC m=+0.076886986 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true) Feb 20 04:48:54 localhost systemd[1]: tmp-crun.FaTQhF.mount: Deactivated successfully. 
Feb 20 04:48:54 localhost podman[306183]: 2026-02-20 09:48:54.163924983 +0000 UTC m=+0.097340507 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127) Feb 20 04:48:54 localhost 
nova_compute[281288]: 2026-02-20 09:48:54.225 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:54 localhost nova_compute[281288]: 2026-02-20 09:48:54.226 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:54 localhost nova_compute[281288]: 2026-02-20 09:48:54.226 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:54 localhost nova_compute[281288]: 2026-02-20 09:48:54.227 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:54 localhost podman[306183]: 2026-02-20 09:48:54.249289372 +0000 UTC m=+0.182704936 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Feb 20 04:48:54 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:48:54 localhost podman[306182]: 2026-02-20 09:48:54.265014291 +0000 UTC m=+0.201074814 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:48:54 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:48:54 localhost nova_compute[281288]: 2026-02-20 09:48:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:54 localhost nova_compute[281288]: 2026-02-20 09:48:54.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:48:54 localhost nova_compute[281288]: 2026-02-20 09:48:54.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:48:54 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.219 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.220 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.220 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info 
cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.221 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.732 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.751 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.752 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.752 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.753 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:48:55 localhost nova_compute[281288]: 2026-02-20 09:48:55.753 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:48:56 localhost openstack_network_exporter[244414]: ERROR 09:48:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:48:56 localhost openstack_network_exporter[244414]: Feb 20 04:48:56 localhost openstack_network_exporter[244414]: ERROR 09:48:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:48:56 localhost openstack_network_exporter[244414]: Feb 20 04:48:56 localhost nova_compute[281288]: 2026-02-20 09:48:56.648 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:56 localhost nova_compute[281288]: 2026-02-20 09:48:56.652 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:48:59 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:00 localhost sshd[306225]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:49:01 localhost nova_compute[281288]: 2026-02-20 09:49:01.651 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:49:03 localhost podman[306227]: 2026-02-20 09:49:03.156418368 +0000 UTC m=+0.089857794 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute) Feb 20 04:49:03 localhost podman[306227]: 2026-02-20 09:49:03.172582081 +0000 UTC m=+0.106021437 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 20 04:49:03 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:49:04 localhost sshd[306246]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:49:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:49:06.012 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:49:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:49:06.013 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:49:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:49:06.013 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:49:06 localhost nova_compute[281288]: 2026-02-20 09:49:06.654 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:11 localhost nova_compute[281288]: 2026-02-20 09:49:11.657 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:49:13 localhost podman[306248]: 2026-02-20 09:49:13.129299389 +0000 UTC m=+0.068851906 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:49:13 localhost podman[306248]: 2026-02-20 09:49:13.137814844 +0000 UTC m=+0.077367391 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:49:13 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:49:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:16 localhost nova_compute[281288]: 2026-02-20 09:49:16.660 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:17 localhost podman[241968]: time="2026-02-20T09:49:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:49:17 localhost podman[241968]: @ - - [20/Feb/2026:09:49:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:49:17 localhost podman[241968]: @ - - [20/Feb/2026:09:49:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18295 "" "Go-http-client/1.1" Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.207 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 
'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.234 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.235 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9153d565-cea2-443b-a551-f9356b5deb53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.208748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c91c4be-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '0b8d8f781f93177c68d2aac88f31e1c471d8fe5d6afb33997ea911f3d443ddce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.208748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c91d4ae-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '7bf7da62ffac372a41aa8b06cfb31abe0a7b04c7226c28da3adf6a04afe6652d'}]}, 'timestamp': '2026-02-20 09:49:18.235794', '_unique_id': 'bbbec8174e974a1e9b9ba70bbac7dd70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.237 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.238 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b09932f-8881-420d-9c0e-b29eaf21b0b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.237829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c923598-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '2b125be1ee260173351675c9761406b0eac063bda3d51385e93fc64206333fd4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.237829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c924556-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '6932083c680ddcd2697066784f85dbba65a2d1250a57cbd1ce956e722cfd0db3'}]}, 'timestamp': '2026-02-20 09:49:18.238673', '_unique_id': 'e058ee5d94cf462dba3cdce202f8491a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.244 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '580bb93f-c9b7-423a-b0fd-948c9ec8abb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.240716', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9343ca-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '92dc00050e3d2a8447f7e873561a0c6974efd7b6daa8d9db26df3079f6349d6b'}]}, 'timestamp': '2026-02-20 09:49:18.245116', '_unique_id': 'ee1723eccb6242f0a7acd48bc621b2e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.246 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.263 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 15800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ae4a6ce-1391-44eb-9918-f2fdfc578cc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15800000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:49:18.246479', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6c96360c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.502865791, 'message_signature': '2e5dd20065c21d61d209b1f7f0f236121d9491f088a4d9cc193139382cac5534'}]}, 'timestamp': '2026-02-20 09:49:18.264569', '_unique_id': '726d96870fa541ec9328706e261f4631'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.267 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '51dbc5ae-bfc8-406a-a7f2-61f7e1c2c6ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.267326', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c96b6a4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '083da58d8722f6fd30dbf6e3fbab1ff04fabb7fffcf833a9d1cfdbc825fe2f37'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.267326', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c96c892-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '791a7d506aae2e382f00b74b8114aa5cf33b0ebcf44f44c4228d7aee047e7a84'}]}, 'timestamp': '2026-02-20 09:49:18.268230', '_unique_id': '27f7d8782acc4194b3d26cba9338f034'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.270 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3c2cd70e-93d0-41b8-8496-c8aa34ada277', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.270409', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c972fee-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 'ce2148c854ef5a44483eea4496078eefddbd39e391788febb8a031f8a2d6b997'}]}, 'timestamp': '2026-02-20 09:49:18.270906', '_unique_id': '3e9cc1f7a2974759bad3e43af9d96cfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.272 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.273 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '549cd643-09a9-4331-9fef-9421ee99a5cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.273020', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9794a2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 'f8c3a77bd0d892b39caf7ff2aafc32f10a6bce68eaeb74a0be0a6ef73e1aac77'}]}, 'timestamp': '2026-02-20 09:49:18.273478', '_unique_id': 'e55452a4d75142308bdf97a708106822'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.275 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f13408c-ec8a-475c-a416-4c00db345aff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.275598', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c97fa8c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 'fb0be74a5abd238be6c6187ccfad50b65e29ad8727da8847bc2256a1995f106e'}]}, 'timestamp': '2026-02-20 09:49:18.276087', '_unique_id': '0b5ba298f1b74d9b998da84cef5cc418'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.278 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61ec3dab-db04-42c2-8f15-36725250ef03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.278515', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c986d14-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 'fac0663a8752a1ac6b06e62c99e9729be9a401e712dd91abe5252b3dac8bd6b8'}]}, 'timestamp': '2026-02-20 09:49:18.279036', '_unique_id': 'c15b992621ff4b8cace54963a6643fc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20
04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.281 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.291 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.292 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7b5e42ee-3c0b-4466-b558-18e837b888e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.281146', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9a72da-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': 'b97b762e1e5727b81c1d5e4af9e68090d858458c4df3d07c903b8d2f215e08e7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.281146', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9a8572-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': 'b0806a3e25d936ead7c5093668c95d10af29459ad3a95ff096ae0e9e8ad21927'}]}, 'timestamp': '2026-02-20 09:49:18.292765', '_unique_id': 'c03dc62dd0b944d4b9a4a9cc0e40cfac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:49:18.294 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:49:18.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.295 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.295 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd02d59a1-c068-4aa3-9db2-df0dacd98d18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:49:18.295365', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6c9afd90-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.502865791, 'message_signature': 'd2621df43352528d294302bfedc298f575ab5acd5e739ac0c08864ccaa8fb90b'}]}, 'timestamp': '2026-02-20 09:49:18.295836', '_unique_id': '70a4e4820ed149d88e97d5eda5eb21d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:49:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.298 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '115920db-b60d-4fd6-8b0c-74ad20f29a75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.298543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9b7afe-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '66f94a6bb81db6d77a21ac7804efdcbf189714a6872105aaadad6982052ad7ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.298543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9b8b16-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': 'a32485268c150bd933ad09fc3c87161c7c931ab6a96b2ad145c56446f8daa464'}]}, 'timestamp': '2026-02-20 09:49:18.299451', '_unique_id': '301914c38b6d4e85a0476aecc0593d75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:49:18.300 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.301 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '487f7958-56d0-4ab7-a0fe-09877f476ccf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.301568', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 
'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9bf132-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '8d93a5b83e53040e64c9e520766fa8b56da787ea9fcb201fb363b9cfacae0444'}]}, 'timestamp': '2026-02-20 09:49:18.302061', '_unique_id': '26b857e38d73424d8d06981911415f79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '057082e9-764b-4065-a9f8-7515c4dd2255', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.304143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9c53f2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': '4195a9a941c32752f471fdf71b3442e36e0be8198f35cede9dfe27087df44317'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.304143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9c6518-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': 'c7acbad9aaea2b2e386b090ac531a39f00541f7d6ebaef8960d28ac4170100cb'}]}, 'timestamp': '2026-02-20 09:49:18.305001', '_unique_id': '9a0bc337a5294a6480719f518f1a2bde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:49:18.305 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:49:18.305 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.307 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0e7820c9-a46a-4d0a-a169-e64f3c607d4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.307289', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9ccf58-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '0e1af3938816ad6aa2ccc83f07ad23dddd79adff69b3516cb7d0afaf8c2c6006'}]}, 'timestamp': '2026-02-20 09:49:18.307779', '_unique_id': '37cfedd0a3fb44428b949f195430ddaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.309 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.310 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d998d51-87b0-487f-b4fe-587638f8345e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.309902', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9d34f2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': '3d3087dd44bef3a9299bda9bcf4182613e2bec736b53fc2b98daa8e4771b5b3c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.309902', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9d44ec-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': '40cf9c4ea41b02da3d538481d74198539f131b0ca7d8f5835eee8d67375d2cc2'}]}, 'timestamp': '2026-02-20 09:49:18.310758', '_unique_id': '76e19bf3857a493db4f0a1872c6ee227'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.312 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '221cfeee-1735-42c4-9df2-3eff736a1da3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.313062', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9db094-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '631832dbdb8aac4c786003e2d4edcd99ca318d044436164efada5be900da45c2'}]}, 'timestamp': '2026-02-20 09:49:18.313516', '_unique_id': '0000feb62b52433ea4bf9897f0024477'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.315 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.315 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.315 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '32f342fc-241e-4c1c-ba83-6609b4496d89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.315581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9e13ea-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': 'a6e42984c09e650aabdb39a28360faf88b09553a0b46e9e4680cfab5cef5e62a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.315581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9e1e8a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '1e57c6bb30f617259e72e10424b63a4320d8718214ca158c252080b437490973'}]}, 'timestamp': '2026-02-20 09:49:18.316231', '_unique_id': '032e68c5f5b5400eb6e023ac50896179'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.317 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9b92159-9db7-47a6-a700-f50d56803003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.317743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9e644e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '2c5e3a6282aee253099de6b7cfc53dff3a1e238faca11905926ed73ce1bccb20'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.317743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9e6e94-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': 'c705231169b01982323a8a99824b1145f179a61f3591f40042deb3e405a91547'}]}, 'timestamp': '2026-02-20 09:49:18.318279', '_unique_id': '03a78aaa904b44a183e6d4048b3db869'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.319 12 DEBUG 
ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae135d58-f63e-4512-96a5-f96018c88480', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.319706', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9eb16a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 
'5be8a17acbd60a41b0b9f339f7ad2074176fb5f74896bf8094efb154d68c0236'}]}, 'timestamp': '2026-02-20 09:49:18.320011', '_unique_id': '55949289ee9b43f7aecefee9bade305f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '792f3301-bf55-487e-82b2-3f55478d166d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.321534', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9ef95e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '6eef2927f9afe77db1a657ebebdfda84cebdedc1b3603d8c3cb262347a6ebd1a'}]}, 'timestamp': '2026-02-20 09:49:18.321853', '_unique_id': '6836aa72ee1a4627a0f2bd88b5187376'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:49:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:49:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging Feb 20 04:49:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:49:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:49:19 localhost podman[306271]: 2026-02-20 09:49:19.156195641 +0000 UTC m=+0.092056300 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:49:19 
localhost podman[306271]: 2026-02-20 09:49:19.168306482 +0000 UTC m=+0.104167131 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:49:19 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:49:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:49:21 localhost podman[306296]: 2026-02-20 09:49:21.137160004 +0000 UTC m=+0.077217587 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, version=9.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, architecture=x86_64, build-date=2026-02-05T04:57:10Z, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:49:21 localhost podman[306296]: 2026-02-20 09:49:21.145187324 +0000 UTC m=+0.085244727 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1770267347, config_data={'command': [], 'environment': 
{'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.7, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:49:21 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:49:21 localhost nova_compute[281288]: 2026-02-20 09:49:21.663 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:25 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:49:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:49:25 localhost podman[306316]: 2026-02-20 09:49:25.955568859 +0000 UTC m=+0.040199482 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller) Feb 20 04:49:25 localhost podman[306316]: 2026-02-20 09:49:25.979992879 +0000 UTC m=+0.064623522 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:49:25 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:49:26 localhost podman[306317]: 2026-02-20 09:49:26.014756747 +0000 UTC m=+0.097077880 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:49:26 localhost 
podman[306317]: 2026-02-20 09:49:26.048041931 +0000 UTC m=+0.130363104 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 04:49:26 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:49:26 localhost openstack_network_exporter[244414]: ERROR 09:49:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:49:26 localhost openstack_network_exporter[244414]: Feb 20 04:49:26 localhost openstack_network_exporter[244414]: ERROR 09:49:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:49:26 localhost openstack_network_exporter[244414]: Feb 20 04:49:26 localhost nova_compute[281288]: 2026-02-20 09:49:26.665 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:26 localhost nova_compute[281288]: 2026-02-20 09:49:26.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:29 localhost sshd[306360]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:49:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:31 localhost nova_compute[281288]: 2026-02-20 09:49:31.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:32 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:49:32 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:49:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:49:34 localhost podman[306448]: 2026-02-20 09:49:34.149799018 +0000 UTC m=+0.083378901 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
container_name=ceilometer_agent_compute) Feb 20 04:49:34 localhost podman[306448]: 2026-02-20 09:49:34.159290861 +0000 UTC m=+0.092870744 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team) Feb 20 04:49:34 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:49:34 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:49:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:36 localhost nova_compute[281288]: 2026-02-20 09:49:36.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:40 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:41 localhost nova_compute[281288]: 2026-02-20 09:49:41.673 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:49:44 localhost podman[306466]: 2026-02-20 09:49:44.128207181 +0000 UTC m=+0.068968881 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:49:44 localhost podman[306466]: 2026-02-20 09:49:44.140074365 +0000 UTC m=+0.080836085 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:49:44 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:49:45 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:46 localhost nova_compute[281288]: 2026-02-20 09:49:46.677 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:49:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5069 writes, 22K keys, 5069 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5069 writes, 696 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 90 writes, 307 keys, 90 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s#012Interval WAL: 90 writes, 39 syncs, 2.31 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:49:47 localhost podman[241968]: time="2026-02-20T09:49:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:49:47 localhost podman[241968]: @ - - [20/Feb/2026:09:49:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:49:47 localhost podman[241968]: @ - - [20/Feb/2026:09:49:47 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18294 "" "Go-http-client/1.1" Feb 20 04:49:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:49:50 localhost sshd[306500]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:49:50 localhost systemd[1]: tmp-crun.nXONmp.mount: Deactivated successfully. Feb 20 04:49:50 localhost podman[306488]: 2026-02-20 09:49:50.157723851 +0000 UTC m=+0.095918135 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=node_exporter) Feb 20 04:49:50 localhost podman[306488]: 2026-02-20 09:49:50.166216014 +0000 UTC m=+0.104410298 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:49:50 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:49:50 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:50 localhost nova_compute[281288]: 2026-02-20 09:49:50.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:49:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5997 writes, 25K keys, 5997 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5997 writes, 930 syncs, 6.45 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 177 writes, 364 keys, 177 commit groups, 1.0 writes per commit group, ingest: 0.34 MB, 0.00 MB/s#012Interval WAL: 177 writes, 85 syncs, 2.08 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:49:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:49:51 localhost podman[306514]: 2026-02-20 09:49:51.661624797 +0000 UTC m=+0.079676710 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 04:49:51 localhost podman[306514]: 2026-02-20 09:49:51.678190841 +0000 UTC m=+0.096242774 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9) Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.678 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.680 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.680 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.680 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.681 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.684 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:51 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.738 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.738 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.738 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.739 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:49:51 localhost nova_compute[281288]: 2026-02-20 09:49:51.739 281292 DEBUG oslo_concurrency.processutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:49:52 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:49:52 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1972084818' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.229 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.315 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.315 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.515 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.517 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11775MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.518 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.518 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.585 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.586 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.586 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.627 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:49:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:49:52.965 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:49:52 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:49:52.965 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:49:52 localhost nova_compute[281288]: 2026-02-20 09:49:52.991 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:49:53 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1147410803' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:49:53 localhost nova_compute[281288]: 2026-02-20 09:49:53.087 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:49:53 localhost nova_compute[281288]: 2026-02-20 09:49:53.094 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:49:53 localhost nova_compute[281288]: 2026-02-20 09:49:53.110 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 
'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:49:53 localhost nova_compute[281288]: 2026-02-20 09:49:53.112 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:49:53 localhost nova_compute[281288]: 2026-02-20 09:49:53.113 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:49:54 localhost nova_compute[281288]: 2026-02-20 09:49:54.109 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:54 localhost nova_compute[281288]: 2026-02-20 09:49:54.110 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:54 localhost nova_compute[281288]: 2026-02-20 09:49:54.127 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:54 localhost nova_compute[281288]: 2026-02-20 09:49:54.720 281292 DEBUG 
oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:54 localhost nova_compute[281288]: 2026-02-20 09:49:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:54 localhost nova_compute[281288]: 2026-02-20 09:49:54.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:49:55 localhost nova_compute[281288]: 2026-02-20 09:49:55.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:55 localhost nova_compute[281288]: 2026-02-20 09:49:55.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:49:55 localhost nova_compute[281288]: 2026-02-20 09:49:55.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:49:55 localhost nova_compute[281288]: 2026-02-20 09:49:55.793 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:49:55 localhost nova_compute[281288]: 2026-02-20 09:49:55.794 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:49:55 localhost nova_compute[281288]: 2026-02-20 09:49:55.794 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:49:55 localhost nova_compute[281288]: 2026-02-20 09:49:55.795 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:49:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:49:56 localhost podman[306578]: 2026-02-20 09:49:56.142683971 +0000 UTC m=+0.078572497 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:49:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:49:56 localhost podman[306578]: 2026-02-20 09:49:56.185028626 +0000 UTC m=+0.120917182 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3) Feb 20 04:49:56 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:49:56 localhost podman[306597]: 2026-02-20 09:49:56.243888913 +0000 UTC m=+0.082880616 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:49:56 localhost 
podman[306597]: 2026-02-20 09:49:56.274950881 +0000 UTC m=+0.113942554 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent) Feb 20 04:49:56 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:49:56 localhost nova_compute[281288]: 2026-02-20 09:49:56.337 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:49:56 localhost nova_compute[281288]: 2026-02-20 09:49:56.358 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:49:56 localhost nova_compute[281288]: 2026-02-20 09:49:56.358 281292 DEBUG nova.compute.manager [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:49:56 localhost openstack_network_exporter[244414]: ERROR 09:49:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:49:56 localhost openstack_network_exporter[244414]: Feb 20 04:49:56 localhost openstack_network_exporter[244414]: ERROR 09:49:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:49:56 localhost openstack_network_exporter[244414]: Feb 20 04:49:56 localhost nova_compute[281288]: 2026-02-20 09:49:56.683 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:49:56 localhost nova_compute[281288]: 2026-02-20 09:49:56.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:49:56 localhost nova_compute[281288]: 2026-02-20 09:49:56.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:49:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:49:57.968 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:49:59 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e92 e92: 6 total, 6 up, 6 in Feb 20 04:50:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:00 localhost ceph-mon[301857]: overall HEALTH_OK Feb 20 04:50:01 localhost nova_compute[281288]: 2026-02-20 09:50:01.687 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:01 localhost nova_compute[281288]: 2026-02-20 09:50:01.689 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:01 localhost nova_compute[281288]: 2026-02-20 09:50:01.690 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:50:01 localhost nova_compute[281288]: 2026-02-20 09:50:01.690 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:01 localhost nova_compute[281288]: 2026-02-20 09:50:01.722 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:01 
localhost nova_compute[281288]: 2026-02-20 09:50:01.723 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 e93: 6 total, 6 up, 6 in Feb 20 04:50:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:50:05 localhost systemd[1]: tmp-crun.N4ssJV.mount: Deactivated successfully. Feb 20 04:50:05 localhost podman[306619]: 2026-02-20 09:50:05.148520799 +0000 UTC m=+0.086633518 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 20 04:50:05 localhost podman[306619]: 2026-02-20 09:50:05.186209145 +0000 UTC m=+0.124321824 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 20 04:50:05 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:50:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:06.013 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:50:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:50:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:50:06 localhost 
sshd[306638]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:50:06 localhost nova_compute[281288]: 2026-02-20 09:50:06.724 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:11 localhost nova_compute[281288]: 2026-02-20 09:50:11.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:11 localhost nova_compute[281288]: 2026-02-20 09:50:11.730 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:11 localhost nova_compute[281288]: 2026-02-20 09:50:11.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:50:11 localhost nova_compute[281288]: 2026-02-20 09:50:11.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:11 localhost nova_compute[281288]: 2026-02-20 09:50:11.765 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:11 localhost nova_compute[281288]: 2026-02-20 09:50:11.766 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:14 localhost sshd[306640]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:50:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:50:14 localhost podman[306642]: 2026-02-20 09:50:14.620660085 +0000 UTC m=+0.084317499 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:50:14 localhost podman[306642]: 2026-02-20 09:50:14.633056765 +0000 UTC m=+0.096714209 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:50:14 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:50:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:16 localhost nova_compute[281288]: 2026-02-20 09:50:16.766 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:16 localhost nova_compute[281288]: 2026-02-20 09:50:16.768 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:16 localhost nova_compute[281288]: 2026-02-20 09:50:16.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:50:16 localhost nova_compute[281288]: 2026-02-20 09:50:16.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:16 localhost nova_compute[281288]: 2026-02-20 09:50:16.821 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:16 localhost nova_compute[281288]: 2026-02-20 09:50:16.822 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:17 localhost podman[241968]: time="2026-02-20T09:50:17Z" level=info msg="List containers: 
received `last` parameter - overwriting `limit`" Feb 20 04:50:17 localhost podman[241968]: @ - - [20/Feb/2026:09:50:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:50:17 localhost podman[241968]: @ - - [20/Feb/2026:09:50:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18293 "" "Go-http-client/1.1" Feb 20 04:50:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:50:21 localhost podman[306665]: 2026-02-20 09:50:21.144881857 +0000 UTC m=+0.082306459 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 
'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:50:21 localhost podman[306665]: 2026-02-20 09:50:21.178948915 +0000 UTC m=+0.116373517 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:50:21 localhost systemd[1]: 
f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:50:21 localhost nova_compute[281288]: 2026-02-20 09:50:21.823 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:21 localhost nova_compute[281288]: 2026-02-20 09:50:21.860 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:21 localhost nova_compute[281288]: 2026-02-20 09:50:21.860 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:50:21 localhost nova_compute[281288]: 2026-02-20 09:50:21.861 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:21 localhost nova_compute[281288]: 2026-02-20 09:50:21.862 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:21 localhost nova_compute[281288]: 2026-02-20 09:50:21.863 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:21 localhost sshd[306689]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:50:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:50:22 localhost podman[306691]: 2026-02-20 09:50:22.150409712 +0000 UTC m=+0.082452003 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest 
release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter) Feb 20 04:50:22 localhost podman[306691]: 2026-02-20 09:50:22.193097067 +0000 UTC m=+0.125139358 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, release=1770267347, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:50:22 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:50:25 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:26 localhost ovn_controller[156798]: 2026-02-20T09:50:26Z|00069|memory_trim|INFO|Detected inactivity (last active 30024 ms ago): trimming memory Feb 20 04:50:26 localhost openstack_network_exporter[244414]: ERROR 09:50:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:50:26 localhost openstack_network_exporter[244414]: Feb 20 04:50:26 localhost openstack_network_exporter[244414]: ERROR 09:50:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:50:26 localhost openstack_network_exporter[244414]: Feb 20 04:50:26 localhost nova_compute[281288]: 2026-02-20 09:50:26.897 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:26 localhost nova_compute[281288]: 2026-02-20 09:50:26.899 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:26 localhost nova_compute[281288]: 2026-02-20 09:50:26.899 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:50:26 localhost nova_compute[281288]: 2026-02-20 
09:50:26.900 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:26 localhost nova_compute[281288]: 2026-02-20 09:50:26.901 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:26 localhost nova_compute[281288]: 2026-02-20 09:50:26.902 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:50:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:50:27 localhost podman[306711]: 2026-02-20 09:50:27.14798154 +0000 UTC m=+0.079908906 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:50:27 localhost systemd[1]: tmp-crun.G8NbrI.mount: Deactivated successfully. Feb 20 04:50:27 localhost podman[306712]: 2026-02-20 09:50:27.214505427 +0000 UTC m=+0.150381402 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:50:27 localhost podman[306712]: 2026-02-20 09:50:27.225045042 +0000 UTC m=+0.160921057 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 04:50:27 localhost sshd[306752]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:50:27 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:50:27 localhost podman[306711]: 2026-02-20 09:50:27.266396327 +0000 UTC m=+0.198323673 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:50:27 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:50:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:31 localhost nova_compute[281288]: 2026-02-20 09:50:31.902 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:31 localhost nova_compute[281288]: 2026-02-20 09:50:31.903 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:31 localhost nova_compute[281288]: 2026-02-20 09:50:31.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:50:31 localhost nova_compute[281288]: 2026-02-20 09:50:31.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:31 localhost nova_compute[281288]: 2026-02-20 09:50:31.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:31 localhost nova_compute[281288]: 2026-02-20 09:50:31.908 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:34 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:50:34 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:50:34 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:50:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:50:36 localhost systemd[1]: tmp-crun.mTJawC.mount: Deactivated successfully. Feb 20 04:50:36 localhost podman[306841]: 2026-02-20 09:50:36.17781094 +0000 UTC m=+0.102427269 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:50:36 localhost podman[306841]: 2026-02-20 09:50:36.192680724 +0000 UTC m=+0.117297043 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 04:50:36 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. 
Feb 20 04:50:36 localhost nova_compute[281288]: 2026-02-20 09:50:36.908 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:40 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:41 localhost nova_compute[281288]: 2026-02-20 09:50:41.909 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:41 localhost nova_compute[281288]: 2026-02-20 09:50:41.911 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:41 localhost nova_compute[281288]: 2026-02-20 09:50:41.911 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:50:41 localhost nova_compute[281288]: 2026-02-20 09:50:41.911 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:41 localhost nova_compute[281288]: 2026-02-20 09:50:41.950 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:41 localhost nova_compute[281288]: 2026-02-20 09:50:41.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:42 localhost nova_compute[281288]: 2026-02-20 09:50:42.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:42 localhost nova_compute[281288]: 2026-02-20 09:50:42.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 20 04:50:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:44.905 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpd2d1_9_k/privsep.sock']#033[00m Feb 20 04:50:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:50:45 localhost podman[306863]: 2026-02-20 09:50:45.18252836 +0000 UTC m=+0.122674365 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:50:45 localhost podman[306863]: 2026-02-20 09:50:45.188405205 +0000 UTC m=+0.128551170 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:50:45 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:50:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.528 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 20 04:50:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.418 306887 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 20 04:50:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.422 306887 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 20 04:50:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.425 306887 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Feb 20 04:50:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.426 306887 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306887#033[00m Feb 20 04:50:45 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.069 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpy7d1rb82/privsep.sock']#033[00m Feb 20 04:50:46 localhost 
neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.672 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 20 04:50:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.563 306896 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 20 04:50:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.568 306896 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 20 04:50:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.571 306896 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Feb 20 04:50:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.571 306896 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306896#033[00m Feb 20 04:50:46 localhost nova_compute[281288]: 2026-02-20 09:50:46.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:46 localhost nova_compute[281288]: 2026-02-20 09:50:46.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:50:46 localhost nova_compute[281288]: 2026-02-20 09:50:46.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:50:46 localhost nova_compute[281288]: 2026-02-20 09:50:46.955 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:46 localhost nova_compute[281288]: 2026-02-20 09:50:46.993 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:46 localhost 
nova_compute[281288]: 2026-02-20 09:50:46.994 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:50:47 localhost nova_compute[281288]: 2026-02-20 09:50:47.275 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:47 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:47.561 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpzojxzdwa/privsep.sock']#033[00m Feb 20 04:50:47 localhost podman[241968]: time="2026-02-20T09:50:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:50:47 localhost podman[241968]: @ - - [20/Feb/2026:09:50:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:50:47 localhost podman[241968]: @ - - [20/Feb/2026:09:50:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1" Feb 20 04:50:48 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.173 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 20 04:50:48 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.066 306908 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 20 04:50:48 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.071 306908 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 20 
04:50:48 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.074 306908 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Feb 20 04:50:48 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.074 306908 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306908#033[00m Feb 20 04:50:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:49.551 264355 INFO neutron.agent.linux.ip_lib [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Device tap31575aad-f0 cannot be used as it has no MAC address#033[00m Feb 20 04:50:49 localhost nova_compute[281288]: 2026-02-20 09:50:49.627 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:49 localhost kernel: device tap31575aad-f0 entered promiscuous mode Feb 20 04:50:49 localhost NetworkManager[5988]: [1771581049.6405] manager: (tap31575aad-f0): new Generic device (/org/freedesktop/NetworkManager/Devices/17) Feb 20 04:50:49 localhost ovn_controller[156798]: 2026-02-20T09:50:49Z|00070|binding|INFO|Claiming lport 31575aad-f0ba-4bf3-a41b-1370b560a95e for this chassis. Feb 20 04:50:49 localhost ovn_controller[156798]: 2026-02-20T09:50:49Z|00071|binding|INFO|31575aad-f0ba-4bf3-a41b-1370b560a95e: Claiming unknown Feb 20 04:50:49 localhost systemd-udevd[306923]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:50:49 localhost nova_compute[281288]: 2026-02-20 09:50:49.644 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:49.655 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-f5fde837-7459-455b-ad78-388d579e00e0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5fde837-7459-455b-ad78-388d579e00e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a71c85068454599bd460f60dda32410', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=019bf044-9ed4-4b4f-84fd-b8df4eb6e270, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=31575aad-f0ba-4bf3-a41b-1370b560a95e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:50:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:49.657 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 31575aad-f0ba-4bf3-a41b-1370b560a95e in datapath f5fde837-7459-455b-ad78-388d579e00e0 bound to our chassis#033[00m Feb 20 04:50:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:49.660 162652 DEBUG 
neutron.agent.ovn.metadata.agent [-] Port 971d15a3-6eaf-4bc0-b431-214a710c6d28 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:50:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:49.660 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5fde837-7459-455b-ad78-388d579e00e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:50:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:49.664 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b49cd48f-5e27-49c7-9483-d62b8f6582ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:50:49 localhost journal[229984]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, ) Feb 20 04:50:49 localhost journal[229984]: hostname: np0005625204.localdomain Feb 20 04:50:49 localhost journal[229984]: ethtool ioctl error on tap31575aad-f0: No such device Feb 20 04:50:49 localhost ovn_controller[156798]: 2026-02-20T09:50:49Z|00072|binding|INFO|Setting lport 31575aad-f0ba-4bf3-a41b-1370b560a95e ovn-installed in OVS Feb 20 04:50:49 localhost ovn_controller[156798]: 2026-02-20T09:50:49Z|00073|binding|INFO|Setting lport 31575aad-f0ba-4bf3-a41b-1370b560a95e up in Southbound Feb 20 04:50:49 localhost journal[229984]: ethtool ioctl error on tap31575aad-f0: No such device Feb 20 04:50:49 localhost nova_compute[281288]: 2026-02-20 09:50:49.675 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:49 localhost journal[229984]: ethtool ioctl error on tap31575aad-f0: No such device Feb 20 04:50:49 localhost journal[229984]: ethtool ioctl error on tap31575aad-f0: No such device Feb 20 04:50:49 
localhost journal[229984]: ethtool ioctl error on tap31575aad-f0: No such device Feb 20 04:50:49 localhost journal[229984]: ethtool ioctl error on tap31575aad-f0: No such device Feb 20 04:50:49 localhost journal[229984]: ethtool ioctl error on tap31575aad-f0: No such device Feb 20 04:50:49 localhost journal[229984]: ethtool ioctl error on tap31575aad-f0: No such device Feb 20 04:50:49 localhost nova_compute[281288]: 2026-02-20 09:50:49.719 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:49 localhost nova_compute[281288]: 2026-02-20 09:50:49.749 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:49 localhost nova_compute[281288]: 2026-02-20 09:50:49.766 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:50 localhost podman[306996]: Feb 20 04:50:50 localhost podman[306996]: 2026-02-20 09:50:50.666400188 +0000 UTC m=+0.063523688 container create cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:50:50 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:50 localhost systemd[1]: Started 
libpod-conmon-cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099.scope. Feb 20 04:50:50 localhost systemd[1]: Started libcrun container. Feb 20 04:50:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f6be432043f77aaa1f4774174218e0ed969c67419b29f51e1d1f5ed1490642/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:50:50 localhost podman[306996]: 2026-02-20 09:50:50.638701181 +0000 UTC m=+0.035824771 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:50:50 localhost podman[306996]: 2026-02-20 09:50:50.744112367 +0000 UTC m=+0.141235867 container init cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:50:50 localhost podman[306996]: 2026-02-20 09:50:50.758735705 +0000 UTC m=+0.155859245 container start cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:50:50 localhost dnsmasq[307014]: started, version 2.85 cachesize 150 Feb 20 04:50:50 
localhost dnsmasq[307014]: DNS service limited to local subnets Feb 20 04:50:50 localhost dnsmasq[307014]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:50:50 localhost dnsmasq[307014]: warning: no upstream servers configured Feb 20 04:50:50 localhost dnsmasq-dhcp[307014]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:50:50 localhost dnsmasq[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/addn_hosts - 0 addresses Feb 20 04:50:50 localhost dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/host Feb 20 04:50:50 localhost dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/opts Feb 20 04:50:50 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:50.817 264355 INFO neutron.agent.dhcp.agent [None req-6a378441-ebbf-4433-970c-af8645e50a93 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:50:48Z, description=, device_id=b5d0538d-0ab3-4d2a-a4dc-0d49c2ca7aa5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=97f20640-625d-4622-93e9-d2725d104851, ip_allocation=immediate, mac_address=fa:16:3e:78:b2:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:43Z, description=, dns_domain=, id=f5fde837-7459-455b-ad78-388d579e00e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1091425961-network, port_security_enabled=True, project_id=5a71c85068454599bd460f60dda32410, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46902, 
qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=251, status=ACTIVE, subnets=['ca5de6a4-7fc3-43ac-b6e5-6653e73f1e70'], tags=[], tenant_id=5a71c85068454599bd460f60dda32410, updated_at=2026-02-20T09:50:44Z, vlan_transparent=None, network_id=f5fde837-7459-455b-ad78-388d579e00e0, port_security_enabled=False, project_id=5a71c85068454599bd460f60dda32410, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=320, status=DOWN, tags=[], tenant_id=5a71c85068454599bd460f60dda32410, updated_at=2026-02-20T09:50:48Z on network f5fde837-7459-455b-ad78-388d579e00e0#033[00m Feb 20 04:50:50 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:50.899 264355 INFO neutron.agent.dhcp.agent [None req-8ed23602-a787-4725-b8c8-510121196255 - - - - - -] DHCP configuration for ports {'8d76b0e0-7a09-42a5-96b9-2313d0cff9d6'} is completed#033[00m Feb 20 04:50:51 localhost dnsmasq[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/addn_hosts - 1 addresses Feb 20 04:50:51 localhost podman[307032]: 2026-02-20 09:50:51.034376745 +0000 UTC m=+0.066617940 container kill cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:50:51 localhost dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/host Feb 20 04:50:51 localhost dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/opts Feb 20 04:50:51 localhost 
neutron_dhcp_agent[264351]: 2026-02-20 09:50:51.158 264355 INFO neutron.agent.dhcp.agent [None req-ab33f320-70c7-48d3-9cdd-d05cd384fb6f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:50:48Z, description=, device_id=b5d0538d-0ab3-4d2a-a4dc-0d49c2ca7aa5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=97f20640-625d-4622-93e9-d2725d104851, ip_allocation=immediate, mac_address=fa:16:3e:78:b2:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:43Z, description=, dns_domain=, id=f5fde837-7459-455b-ad78-388d579e00e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1091425961-network, port_security_enabled=True, project_id=5a71c85068454599bd460f60dda32410, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46902, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=251, status=ACTIVE, subnets=['ca5de6a4-7fc3-43ac-b6e5-6653e73f1e70'], tags=[], tenant_id=5a71c85068454599bd460f60dda32410, updated_at=2026-02-20T09:50:44Z, vlan_transparent=None, network_id=f5fde837-7459-455b-ad78-388d579e00e0, port_security_enabled=False, project_id=5a71c85068454599bd460f60dda32410, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=320, status=DOWN, tags=[], tenant_id=5a71c85068454599bd460f60dda32410, updated_at=2026-02-20T09:50:48Z on network f5fde837-7459-455b-ad78-388d579e00e0#033[00m Feb 20 04:50:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:51.309 264355 INFO neutron.agent.dhcp.agent [None req-52186cb0-788d-4113-87a9-aa379704d383 - - - 
- - -] DHCP configuration for ports {'97f20640-625d-4622-93e9-d2725d104851'} is completed#033[00m Feb 20 04:50:51 localhost dnsmasq[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/addn_hosts - 1 addresses Feb 20 04:50:51 localhost podman[307069]: 2026-02-20 09:50:51.459081007 +0000 UTC m=+0.066258360 container kill cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:50:51 localhost dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/host Feb 20 04:50:51 localhost dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/opts Feb 20 04:50:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:50:51 localhost podman[307087]: 2026-02-20 09:50:51.629533626 +0000 UTC m=+0.072606928 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:50:51 localhost podman[307087]: 2026-02-20 09:50:51.643164093 +0000 UTC m=+0.086237435 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:50:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:51.653 264355 INFO neutron.agent.dhcp.agent [None req-718f11f1-e233-41cd-a860-e1e2cf44efbe - - - - - -] DHCP configuration for ports {'97f20640-625d-4622-93e9-d2725d104851'} is completed#033[00m Feb 20 04:50:51 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.743 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.743 281292 DEBUG oslo_concurrency.processutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:50:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:51.763 264355 INFO neutron.agent.linux.ip_lib [None req-bbba7d19-4bf4-4398-86f3-888da9f43e3e - - - - - -] Device tap9906e141-c3 cannot be used as it has no MAC address#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:51 localhost kernel: device tap9906e141-c3 entered promiscuous mode Feb 20 04:50:51 localhost NetworkManager[5988]: [1771581051.8364] manager: (tap9906e141-c3): new Generic device (/org/freedesktop/NetworkManager/Devices/18) Feb 20 04:50:51 localhost systemd-udevd[306925]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:50:51 localhost ovn_controller[156798]: 2026-02-20T09:50:51Z|00074|binding|INFO|Claiming lport 9906e141-c388-453f-9169-7c98a351db5e for this chassis. 
Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.838 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:51 localhost ovn_controller[156798]: 2026-02-20T09:50:51Z|00075|binding|INFO|9906e141-c388-453f-9169-7c98a351db5e: Claiming unknown Feb 20 04:50:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:51.848 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad9ac3f8-d9ff-4a1d-8092-e57f93de7b33, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9906e141-c388-453f-9169-7c98a351db5e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:50:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:51.850 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 9906e141-c388-453f-9169-7c98a351db5e in datapath 
51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 bound to our chassis#033[00m Feb 20 04:50:51 localhost ovn_controller[156798]: 2026-02-20T09:50:51Z|00076|binding|INFO|Setting lport 9906e141-c388-453f-9169-7c98a351db5e ovn-installed in OVS Feb 20 04:50:51 localhost ovn_controller[156798]: 2026-02-20T09:50:51Z|00077|binding|INFO|Setting lport 9906e141-c388-453f-9169-7c98a351db5e up in Southbound Feb 20 04:50:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:51.852 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8d59c69c-3a69-449e-9d36-233c1f4c5c30 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:50:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:51.852 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.853 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:51.853 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5169d689-d1d1-4c0b-be6f-01d9e1c84395]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.870 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.903 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:51 localhost 
nova_compute[281288]: 2026-02-20 09:50:51.938 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:51 localhost nova_compute[281288]: 2026-02-20 09:50:51.995 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.228 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.291 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.312 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.312 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.560 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.562 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11475MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.563 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.563 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.654 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.655 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.655 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:50:52 localhost nova_compute[281288]: 2026-02-20 09:50:52.710 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:50:52 localhost podman[307201]: Feb 20 04:50:52 localhost podman[307201]: 2026-02-20 09:50:52.833101685 +0000 UTC m=+0.081561617 container create a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 04:50:52 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:50:52 localhost systemd[1]: Started libpod-conmon-a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7.scope. Feb 20 04:50:52 localhost podman[307201]: 2026-02-20 09:50:52.784629827 +0000 UTC m=+0.033089849 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:50:52 localhost systemd[1]: Started libcrun container. Feb 20 04:50:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b7f60ded7c720c0eef64273aac4a3860430b8d407f2c1fb2fc8c0fb6a9fb560/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:50:52 localhost podman[307201]: 2026-02-20 09:50:52.907492036 +0000 UTC m=+0.155951998 container init a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:50:52 localhost podman[307201]: 2026-02-20 09:50:52.91801965 +0000 UTC m=+0.166479622 container start a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:50:52 localhost dnsmasq[307248]: started, version 2.85 cachesize 150 Feb 20 04:50:52 localhost dnsmasq[307248]: DNS service limited to local subnets Feb 20 04:50:52 localhost dnsmasq[307248]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:50:52 localhost dnsmasq[307248]: warning: no upstream servers configured Feb 20 04:50:52 localhost dnsmasq-dhcp[307248]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:50:52 localhost dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 0 addresses Feb 20 04:50:52 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host Feb 20 04:50:52 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts Feb 20 04:50:52 localhost podman[307233]: 2026-02-20 09:50:52.976563088 +0000 UTC m=+0.101051038 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.) Feb 20 04:50:52 localhost podman[307233]: 2026-02-20 09:50:52.990429132 +0000 UTC m=+0.114917102 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1770267347, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, architecture=x86_64, build-date=2026-02-05T04:57:10Z, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:50:53 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:50:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:53.147 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:50:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:53.153 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:50:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:53.162 264355 INFO neutron.agent.dhcp.agent [None req-c3ae247d-f4e6-4f75-a064-f1a1e271b441 - - - - - -] DHCP configuration for ports {'2b93bbc2-5aeb-49cc-b610-6f4f7708d346'} is completed#033[00m Feb 20 04:50:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:50:53 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1986532880' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:50:53 localhost nova_compute[281288]: 2026-02-20 09:50:53.186 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:53 localhost nova_compute[281288]: 2026-02-20 09:50:53.199 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:50:53 localhost nova_compute[281288]: 2026-02-20 09:50:53.206 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:50:53 localhost nova_compute[281288]: 2026-02-20 09:50:53.220 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:50:53 localhost nova_compute[281288]: 2026-02-20 09:50:53.223 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for 
np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:50:53 localhost nova_compute[281288]: 2026-02-20 09:50:53.223 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:50:53 localhost nova_compute[281288]: 2026-02-20 09:50:53.224 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:54 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:54.201 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:50:53Z, description=, device_id=0c77eb17-66e6-4aa0-8b78-169b259339e9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=034bd422-60dc-4b3b-ba25-f380892caff4, ip_allocation=immediate, mac_address=fa:16:3e:14:5b:51, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:49Z, description=, dns_domain=, id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-982155183-network, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59450, 
qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=331, status=ACTIVE, subnets=['c9423f67-342b-44f2-ac81-92ef706f7aa6'], tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:50Z, vlan_transparent=None, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=False, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=372, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:53Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0#033[00m Feb 20 04:50:54 localhost nova_compute[281288]: 2026-02-20 09:50:54.229 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:54 localhost nova_compute[281288]: 2026-02-20 09:50:54.230 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:54 localhost nova_compute[281288]: 2026-02-20 09:50:54.230 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:54 localhost podman[307278]: 2026-02-20 09:50:54.443320076 +0000 UTC m=+0.071507887 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, 
org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:50:54 localhost dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 1 addresses Feb 20 04:50:54 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host Feb 20 04:50:54 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts Feb 20 04:50:54 localhost nova_compute[281288]: 2026-02-20 09:50:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:54 localhost nova_compute[281288]: 2026-02-20 09:50:54.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:54 localhost nova_compute[281288]: 2026-02-20 09:50:54.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 20 04:50:54 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:54.726 264355 INFO neutron.agent.dhcp.agent [None req-693fa230-8a3b-4cc0-bf1a-d1ec6c5f9eaf - - - - - -] DHCP configuration for ports {'034bd422-60dc-4b3b-ba25-f380892caff4'} is completed#033[00m Feb 20 04:50:54 localhost nova_compute[281288]: 2026-02-20 09:50:54.741 281292 DEBUG 
nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 20 04:50:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:50:55 localhost nova_compute[281288]: 2026-02-20 09:50:55.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:55 localhost nova_compute[281288]: 2026-02-20 09:50:55.740 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:50:55 localhost nova_compute[281288]: 2026-02-20 09:50:55.741 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:50:55 localhost nova_compute[281288]: 2026-02-20 09:50:55.844 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:50:55 localhost nova_compute[281288]: 2026-02-20 09:50:55.845 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:50:55 localhost 
nova_compute[281288]: 2026-02-20 09:50:55.845 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:50:55 localhost nova_compute[281288]: 2026-02-20 09:50:55.846 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:50:56 localhost dnsmasq[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/addn_hosts - 0 addresses Feb 20 04:50:56 localhost dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/host Feb 20 04:50:56 localhost dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/opts Feb 20 04:50:56 localhost podman[307317]: 2026-02-20 09:50:56.095333726 +0000 UTC m=+0.059251181 container kill cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 20 04:50:56 localhost ovn_controller[156798]: 2026-02-20T09:50:56Z|00078|binding|INFO|Releasing lport 31575aad-f0ba-4bf3-a41b-1370b560a95e from this chassis (sb_readonly=0) Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.240 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:56 localhost kernel: device tap31575aad-f0 left promiscuous mode Feb 20 04:50:56 localhost ovn_controller[156798]: 2026-02-20T09:50:56Z|00079|binding|INFO|Setting lport 31575aad-f0ba-4bf3-a41b-1370b560a95e down in Southbound Feb 20 04:50:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:56.251 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-f5fde837-7459-455b-ad78-388d579e00e0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5fde837-7459-455b-ad78-388d579e00e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a71c85068454599bd460f60dda32410', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=019bf044-9ed4-4b4f-84fd-b8df4eb6e270, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=31575aad-f0ba-4bf3-a41b-1370b560a95e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:50:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:56.254 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 31575aad-f0ba-4bf3-a41b-1370b560a95e in datapath 
f5fde837-7459-455b-ad78-388d579e00e0 unbound from our chassis#033[00m Feb 20 04:50:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:56.258 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5fde837-7459-455b-ad78-388d579e00e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.258 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:50:56.259 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd4409b-308a-4baf-bb7f-b1aa7c08f6cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.261 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:56 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:56.522 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:50:53Z, description=, device_id=0c77eb17-66e6-4aa0-8b78-169b259339e9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=034bd422-60dc-4b3b-ba25-f380892caff4, ip_allocation=immediate, mac_address=fa:16:3e:14:5b:51, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:49Z, description=, dns_domain=, id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-LiveMigrationTest-982155183-network, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=331, status=ACTIVE, subnets=['c9423f67-342b-44f2-ac81-92ef706f7aa6'], tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:50Z, vlan_transparent=None, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=False, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=372, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:53Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0#033[00m Feb 20 04:50:56 localhost openstack_network_exporter[244414]: ERROR 09:50:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:50:56 localhost openstack_network_exporter[244414]: Feb 20 04:50:56 localhost openstack_network_exporter[244414]: ERROR 09:50:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:50:56 localhost openstack_network_exporter[244414]: Feb 20 04:50:56 localhost dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 1 addresses Feb 20 04:50:56 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host Feb 20 04:50:56 localhost podman[307355]: 2026-02-20 09:50:56.813942002 +0000 UTC m=+0.062269170 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:50:56 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.838 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.921 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.941 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.941 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.942 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:56 localhost nova_compute[281288]: 2026-02-20 09:50:56.943 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:57 localhost nova_compute[281288]: 2026-02-20 09:50:57.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:57.118 264355 INFO neutron.agent.dhcp.agent [None req-4e874753-113f-4a89-9768-950317eae4fb - - - - - -] DHCP configuration for ports {'034bd422-60dc-4b3b-ba25-f380892caff4'} is completed#033[00m Feb 20 04:50:57 localhost sshd[307375]: main: sshd: ssh-rsa algorithm is disabled Feb 20 
04:50:57 localhost ovn_controller[156798]: 2026-02-20T09:50:57Z|00080|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:50:58 localhost nova_compute[281288]: 2026-02-20 09:50:58.005 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:50:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:50:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:50:58 localhost podman[307378]: 2026-02-20 09:50:58.151378089 +0000 UTC m=+0.086078482 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:50:58 localhost podman[307378]: 2026-02-20 09:50:58.197098543 +0000 UTC m=+0.131798986 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 20 04:50:58 localhost podman[307377]: 2026-02-20 09:50:58.126843026 +0000 UTC m=+0.068167227 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:50:58 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:50:58 localhost podman[307377]: 2026-02-20 09:50:58.278311799 +0000 UTC m=+0.219635990 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 04:50:58 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:50:58 localhost dnsmasq[307014]: exiting on receipt of SIGTERM Feb 20 04:50:58 localhost podman[307434]: 2026-02-20 09:50:58.474759404 +0000 UTC m=+0.055206049 container kill cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:50:58 localhost systemd[1]: libpod-cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099.scope: Deactivated successfully. Feb 20 04:50:58 localhost podman[307449]: 2026-02-20 09:50:58.542564149 +0000 UTC m=+0.057121276 container died cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 04:50:58 localhost podman[307449]: 2026-02-20 09:50:58.576168252 +0000 UTC m=+0.090725329 container cleanup cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 20 04:50:58 localhost systemd[1]: libpod-conmon-cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099.scope: Deactivated successfully. Feb 20 04:50:58 localhost podman[307455]: 2026-02-20 09:50:58.620339141 +0000 UTC m=+0.124721315 container remove cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:50:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:58.651 264355 INFO neutron.agent.dhcp.agent [None req-f92b337a-9f33-4906-876a-909ecea07fd9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:50:58 localhost nova_compute[281288]: 2026-02-20 09:50:58.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:50:58 localhost nova_compute[281288]: 2026-02-20 09:50:58.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:50:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:50:58.925 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:50:59 localhost systemd[1]: var-lib-containers-storage-overlay-64f6be432043f77aaa1f4774174218e0ed969c67419b29f51e1d1f5ed1490642-merged.mount: Deactivated successfully. Feb 20 04:50:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099-userdata-shm.mount: Deactivated successfully. Feb 20 04:50:59 localhost systemd[1]: run-netns-qdhcp\x2df5fde837\x2d7459\x2d455b\x2dad78\x2d388d579e00e0.mount: Deactivated successfully. Feb 20 04:50:59 localhost sshd[307479]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:51:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:01 localhost nova_compute[281288]: 2026-02-20 09:51:01.496 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:02 localhost nova_compute[281288]: 2026-02-20 09:51:02.080 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:03 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:03.135 264355 INFO neutron.agent.linux.ip_lib [None req-a4c37026-8e84-4a7a-b634-a4c52fe77266 - - - - - -] Device tape1376599-c9 cannot be used as it has no MAC address#033[00m Feb 20 04:51:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:03.155 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, 
col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:51:03 localhost nova_compute[281288]: 2026-02-20 09:51:03.201 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:03 localhost kernel: device tape1376599-c9 entered promiscuous mode Feb 20 04:51:03 localhost NetworkManager[5988]: [1771581063.2121] manager: (tape1376599-c9): new Generic device (/org/freedesktop/NetworkManager/Devices/19) Feb 20 04:51:03 localhost ovn_controller[156798]: 2026-02-20T09:51:03Z|00081|binding|INFO|Claiming lport e1376599-c9f0-4546-a6b8-9a26e1215192 for this chassis. Feb 20 04:51:03 localhost ovn_controller[156798]: 2026-02-20T09:51:03Z|00082|binding|INFO|e1376599-c9f0-4546-a6b8-9a26e1215192: Claiming unknown Feb 20 04:51:03 localhost nova_compute[281288]: 2026-02-20 09:51:03.214 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:03 localhost systemd-udevd[307491]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:51:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:03.225 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-c461e2c0-bc21-4786-8276-a80f7d59d18a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c461e2c0-bc21-4786-8276-a80f7d59d18a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ce7589beebc4b9187ac7a68f3264776', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b998e545-1dcc-4262-8de8-c6bf3daefa6e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e1376599-c9f0-4546-a6b8-9a26e1215192) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:03.227 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e1376599-c9f0-4546-a6b8-9a26e1215192 in datapath c461e2c0-bc21-4786-8276-a80f7d59d18a bound to our chassis#033[00m Feb 20 04:51:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:03.229 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c461e2c0-bc21-4786-8276-a80f7d59d18a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:51:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:03.230 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[39f30aeb-f288-4c4b-b553-bc5304f5da49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:03 localhost journal[229984]: ethtool ioctl error on tape1376599-c9: No such device Feb 20 04:51:03 localhost ovn_controller[156798]: 2026-02-20T09:51:03Z|00083|binding|INFO|Setting lport e1376599-c9f0-4546-a6b8-9a26e1215192 ovn-installed in OVS Feb 20 04:51:03 localhost ovn_controller[156798]: 2026-02-20T09:51:03Z|00084|binding|INFO|Setting lport e1376599-c9f0-4546-a6b8-9a26e1215192 up in Southbound Feb 20 04:51:03 localhost nova_compute[281288]: 2026-02-20 09:51:03.255 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:03 localhost journal[229984]: ethtool ioctl error on tape1376599-c9: No such device Feb 20 04:51:03 localhost journal[229984]: ethtool ioctl error on tape1376599-c9: No such device Feb 20 04:51:03 localhost journal[229984]: ethtool ioctl error on tape1376599-c9: No such device Feb 20 04:51:03 localhost journal[229984]: ethtool ioctl error on tape1376599-c9: No such device Feb 20 04:51:03 localhost journal[229984]: ethtool ioctl error on tape1376599-c9: No such device Feb 20 04:51:03 localhost journal[229984]: ethtool ioctl error on tape1376599-c9: No such device Feb 20 04:51:03 localhost journal[229984]: ethtool ioctl error on tape1376599-c9: No such device Feb 20 04:51:03 localhost nova_compute[281288]: 2026-02-20 09:51:03.288 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:03 localhost nova_compute[281288]: 2026-02-20 09:51:03.317 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:04 localhost podman[307563]: Feb 20 04:51:04 localhost podman[307563]: 2026-02-20 09:51:04.107676781 +0000 UTC m=+0.085201236 container create 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:51:04 localhost systemd[1]: Started libpod-conmon-44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3.scope. Feb 20 04:51:04 localhost systemd[1]: Started libcrun container. Feb 20 04:51:04 localhost podman[307563]: 2026-02-20 09:51:04.06749515 +0000 UTC m=+0.045019625 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:51:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/678080229dd1159fab2cef2bc14cfbf02bb404410bb244a2f4e58b96ed1ce8f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:51:04 localhost podman[307563]: 2026-02-20 09:51:04.177214977 +0000 UTC m=+0.154739432 container init 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true) Feb 20 04:51:04 localhost podman[307563]: 2026-02-20 09:51:04.185798894 +0000 UTC m=+0.163323349 container start 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:51:04 localhost dnsmasq[307581]: started, version 2.85 cachesize 150 Feb 20 04:51:04 localhost dnsmasq[307581]: DNS service limited to local subnets Feb 20 04:51:04 localhost dnsmasq[307581]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:51:04 localhost dnsmasq[307581]: warning: no upstream servers configured Feb 20 04:51:04 localhost dnsmasq-dhcp[307581]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:51:04 localhost dnsmasq[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/addn_hosts - 0 addresses Feb 20 04:51:04 localhost dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/host Feb 20 04:51:04 localhost dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/opts Feb 20 04:51:04 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:04.309 264355 INFO neutron.agent.dhcp.agent [None req-6fbaa961-c7b3-4807-9c99-6827d93e37ef - - - - - -] DHCP configuration for ports {'5dfd33d6-db01-479a-af8d-bbb800d50548'} is completed#033[00m Feb 20 04:51:05 localhost nova_compute[281288]: 2026-02-20 
09:51:05.439 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:05 localhost nova_compute[281288]: 2026-02-20 09:51:05.978 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:06.015 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:51:07 localhost nova_compute[281288]: 2026-02-20 09:51:07.130 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:07 localhost podman[307582]: 2026-02-20 09:51:07.190181873 +0000 UTC m=+0.128410516 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible) Feb 20 04:51:07 localhost podman[307582]: 2026-02-20 09:51:07.225987542 +0000 UTC m=+0.164216155 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 20 04:51:07 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:51:07 localhost sshd[307601]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:51:09 localhost nova_compute[281288]: 2026-02-20 09:51:09.930 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:10 localhost nova_compute[281288]: 2026-02-20 09:51:10.350 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:10 localhost nova_compute[281288]: 2026-02-20 09:51:10.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:11 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:11.694 2 INFO neutron.agent.securitygroups_rpc [None req-12bd9327-2dd3-43c4-b987-ac4cbf3c449a 0db48d5f6f5e44fc93154cf4b34a94e0 a966116e4ddf4bdea0571a1bb751916e - - default default] Security group member updated ['07d2fe18-fbbf-4547-931e-bb55f378bade']#033[00m Feb 20 04:51:12 localhost nova_compute[281288]: 2026-02-20 09:51:12.161 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:12.198 264355 INFO neutron.agent.dhcp.agent [-] Trigger 
reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:11Z, description=, device_id=11869463-2b1a-4016-a65b-70d38a714c73, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8799e7fd-5c02-4d3b-a4f3-6cd59f823eec, ip_allocation=immediate, mac_address=fa:16:3e:06:57:ef, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:00Z, description=, dns_domain=, id=c461e2c0-bc21-4786-8276-a80f7d59d18a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-776197744-network, port_security_enabled=True, project_id=5ce7589beebc4b9187ac7a68f3264776, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31945, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=445, status=ACTIVE, subnets=['d96daeed-2eb7-4135-afcb-67a1e510422b'], tags=[], tenant_id=5ce7589beebc4b9187ac7a68f3264776, updated_at=2026-02-20T09:51:02Z, vlan_transparent=None, network_id=c461e2c0-bc21-4786-8276-a80f7d59d18a, port_security_enabled=False, project_id=5ce7589beebc4b9187ac7a68f3264776, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=505, status=DOWN, tags=[], tenant_id=5ce7589beebc4b9187ac7a68f3264776, updated_at=2026-02-20T09:51:11Z on network c461e2c0-bc21-4786-8276-a80f7d59d18a#033[00m Feb 20 04:51:12 localhost dnsmasq[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/addn_hosts - 1 addresses Feb 20 04:51:12 localhost podman[307620]: 2026-02-20 09:51:12.433441185 +0000 UTC m=+0.063390384 container kill 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 20 04:51:12 localhost dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/host Feb 20 04:51:12 localhost dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/opts Feb 20 04:51:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:12.683 264355 INFO neutron.agent.dhcp.agent [None req-3b0739f5-8c16-4d34-9227-1e43e182fb2e - - - - - -] DHCP configuration for ports {'8799e7fd-5c02-4d3b-a4f3-6cd59f823eec'} is completed#033[00m Feb 20 04:51:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:13.679 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:11Z, description=, device_id=11869463-2b1a-4016-a65b-70d38a714c73, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8799e7fd-5c02-4d3b-a4f3-6cd59f823eec, ip_allocation=immediate, mac_address=fa:16:3e:06:57:ef, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:00Z, description=, dns_domain=, id=c461e2c0-bc21-4786-8276-a80f7d59d18a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-776197744-network, port_security_enabled=True, 
project_id=5ce7589beebc4b9187ac7a68f3264776, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31945, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=445, status=ACTIVE, subnets=['d96daeed-2eb7-4135-afcb-67a1e510422b'], tags=[], tenant_id=5ce7589beebc4b9187ac7a68f3264776, updated_at=2026-02-20T09:51:02Z, vlan_transparent=None, network_id=c461e2c0-bc21-4786-8276-a80f7d59d18a, port_security_enabled=False, project_id=5ce7589beebc4b9187ac7a68f3264776, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=505, status=DOWN, tags=[], tenant_id=5ce7589beebc4b9187ac7a68f3264776, updated_at=2026-02-20T09:51:11Z on network c461e2c0-bc21-4786-8276-a80f7d59d18a#033[00m Feb 20 04:51:13 localhost systemd[1]: tmp-crun.j09zt6.mount: Deactivated successfully. Feb 20 04:51:13 localhost podman[307658]: 2026-02-20 09:51:13.912250301 +0000 UTC m=+0.080443153 container kill 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 04:51:13 localhost dnsmasq[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/addn_hosts - 1 addresses Feb 20 04:51:13 localhost dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/host Feb 20 04:51:13 localhost dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/opts Feb 20 04:51:14 localhost neutron_dhcp_agent[264351]: 2026-02-20 
09:51:14.136 264355 INFO neutron.agent.dhcp.agent [None req-3356d537-3256-4a3d-b5eb-608060d82911 - - - - - -] DHCP configuration for ports {'8799e7fd-5c02-4d3b-a4f3-6cd59f823eec'} is completed#033[00m Feb 20 04:51:14 localhost nova_compute[281288]: 2026-02-20 09:51:14.219 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e94 e94: 6 total, 6 up, 6 in Feb 20 04:51:15 localhost nova_compute[281288]: 2026-02-20 09:51:15.642 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:15 localhost ovn_controller[156798]: 2026-02-20T09:51:15Z|00085|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:51:15 localhost nova_compute[281288]: 2026-02-20 09:51:15.918 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:51:16 localhost systemd[1]: tmp-crun.7OWOPN.mount: Deactivated successfully. 
Feb 20 04:51:16 localhost podman[307680]: 2026-02-20 09:51:16.152603769 +0000 UTC m=+0.089238756 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:51:16 localhost podman[307680]: 2026-02-20 09:51:16.166076441 +0000 UTC m=+0.102711428 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:51:16 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:51:16 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:16.259 2 INFO neutron.agent.securitygroups_rpc [None req-dd3e0c14-3c22-4790-87f5-ba03a5ef1aea ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Security group member updated ['6a912071-fd9c-4d5f-8453-7f993db3506d']#033[00m Feb 20 04:51:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:16.379 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:15Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=609a0699-8716-4bf8-9f50-bfeec5f65721, ip_allocation=immediate, mac_address=fa:16:3e:c0:a3:f9, name=tempest-parent-420346976, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:49Z, description=, dns_domain=, id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-982155183-network, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=331, status=ACTIVE, subnets=['c9423f67-342b-44f2-ac81-92ef706f7aa6'], tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:50Z, vlan_transparent=None, 
network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6a912071-fd9c-4d5f-8453-7f993db3506d'], standard_attr_id=521, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:51:15Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0#033[00m Feb 20 04:51:16 localhost dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 2 addresses Feb 20 04:51:16 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host Feb 20 04:51:16 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts Feb 20 04:51:16 localhost podman[307722]: 2026-02-20 09:51:16.576584009 +0000 UTC m=+0.048712216 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:51:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:16.798 264355 INFO neutron.agent.dhcp.agent [None req-545fe686-f9ee-4026-ba90-55b31207d3e1 - - - - - -] DHCP configuration for ports {'609a0699-8716-4bf8-9f50-bfeec5f65721'} is completed#033[00m Feb 20 04:51:17 localhost nova_compute[281288]: 2026-02-20 09:51:17.199 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:17 localhost 
neutron_sriov_agent[257177]: 2026-02-20 09:51:17.441 2 INFO neutron.agent.securitygroups_rpc [None req-c36d1673-2dec-447b-a8b3-50030e0a0823 0db48d5f6f5e44fc93154cf4b34a94e0 a966116e4ddf4bdea0571a1bb751916e - - default default] Security group member updated ['07d2fe18-fbbf-4547-931e-bb55f378bade']#033[00m Feb 20 04:51:17 localhost podman[241968]: time="2026-02-20T09:51:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:51:17 localhost podman[241968]: @ - - [20/Feb/2026:09:51:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1" Feb 20 04:51:17 localhost podman[241968]: @ - - [20/Feb/2026:09:51:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19264 "" "Go-http-client/1.1" Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.208 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.238 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.239 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '256dada4-5f98-4b1c-9400-21beaabff7e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.209598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b418f33e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '87946d9f0f26cb5d65c0c4b737223a80bd5c528df60490d05314641979eb4219'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.209598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b41908ce-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '8341a97d95e34528afc5a026d6388e13c5364d3034fa81efd639aced69e866d1'}]}, 'timestamp': '2026-02-20 09:51:18.240090', '_unique_id': '709bf9dd951947b1b897c24e6bd4f9fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:51:18.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.242 12 DEBUG 
ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.246 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1f7df49b-c707-4074-a173-1df2e06ede56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.243183', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b41a13e0-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '411506a4f53bf167bda6d25bcf2598a14864a452893bdad871bc264e4d040150'}]}, 'timestamp': '2026-02-20 09:51:18.246950', '_unique_id': '735add1382994b9eb670cb71ed4bbe32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e08b957-6eca-478d-af56-64d777e8acc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.249741', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b41a9914-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '5cb09b3c4aae2f15e9c0fc07b4837ad6d81b30da9b6fee494a191ece0e11a855'}]}, 'timestamp': '2026-02-20 09:51:18.250356', '_unique_id': 'ea3b9d2d97094c8d8837df79453bd1b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f0fd24-8f7a-4367-ab7f-dc355dfc3849', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.252996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b41b154c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '629e655059f19c753c2f4495ab8c07767f87b7f7be3c75e8643b77155d6e05e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.252996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b41b299c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '588558e4cc07b0649db69bf9a815c7471a05b0215f47e6577271526bb8c07de8'}]}, 'timestamp': '2026-02-20 09:51:18.254052', '_unique_id': '342eb19806df4fae9e0b834665495f87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.256 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45ef484a-765f-45e9-b7bd-0ea7319e0702', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.256757', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b41ba8b8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '9cda7638a540ca035193aec621f25f5005c7c4af0ad2e8da6a0b93ff483fb00e'}]}, 'timestamp': '2026-02-20 09:51:18.257304', '_unique_id': 'fb079755193c4a3592d3b1b043bfa425'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.259 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.276 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d0275d9-b710-40de-b8c7-b80623e4e1d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:51:18.260146', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': 
'43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b41eaf5e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.515627067, 'message_signature': '979f2facea36fa7c892bdbdd621b230d0462f72a2f1cc6dcf24f69a4f6c46717'}]}, 'timestamp': '2026-02-20 09:51:18.277129', '_unique_id': '4b98327a71164f61b31753912dbcb57a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd5df4177-c90f-472d-9e16-b59a373c0441', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.332167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4288524-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '149a6cce4857b69294c88c71d6b7a069ff78e47e19de73f9ce12246c27c5b055'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.332167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4289780-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '93e271380e82c4e0e0cd5e476e3d94ebffcfb5766ba0cdb822ab1a5f25713dec'}]}, 'timestamp': '2026-02-20 09:51:18.342059', '_unique_id': 'eb3d3cc30efc40e3aa530297c67f73c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:51:18.343 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.344 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.344 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 16410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c30287a-212a-4a0b-b194-df603d9396a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16410000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:51:18.344552', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b4290c60-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.515627067, 'message_signature': '27fd54a1e9bcbd59d640640ef88c3f251097d0acbcfff78c4b62ba6890ec6b78'}]}, 'timestamp': '2026-02-20 09:51:18.345038', '_unique_id': '26d83cdf9ffd4cee9be8d993070062a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.347 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '200419c1-83b9-46ce-93c3-7894d4eb8b37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.347145', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b429701a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '18146b90d526b6cc6a5c16891abd7bfe2d658b0724896dfae9f1dea2cb840851'}]}, 'timestamp': '2026-02-20 09:51:18.347604', '_unique_id': '17c933aa86ff4527a7a5baee69a72a20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.349 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.349 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5a34f86-a4f4-4d4d-9d36-94b3eb236daa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.349691', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b429d3c0-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '52f942a50a692f7f2336547fb226c1a144e860e69101062c88d59db6461756f7'}]}, 'timestamp': '2026-02-20 09:51:18.350180', '_unique_id': '266c9dbb83c149a081130aafa9affa2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.352 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.352 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dec1583d-7c13-473a-8eef-2ac59e575bdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.352264', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42a37c0-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '701219493d81052b50baf13165e857beec324ff49f62e28964ac7eeea2d44582'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.352264', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42a490e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '73ba829f44fce98045a0543ac11c103666096e928f15ebe5eb9c17a23344164d'}]}, 'timestamp': '2026-02-20 09:51:18.353127', '_unique_id': 'c8020be45a7944dca0407ff8867de070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:51:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.355 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.355 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.355 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b88c5e6c-1f89-4546-8a09-bfea5aae7dc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.355313', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42aaef8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '07b157154b622adae74ea027a3ac25fea2110251864ebe325356fdde498f2e89'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.355313', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42ac032-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '0cc5d6619dcb379128b3369089d4479c44461a4812e9d5624571d71c9f041e0f'}]}, 'timestamp': '2026-02-20 09:51:18.356174', '_unique_id': 'e4e63f354a68473690cc47ed0de8ca3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:51:18.357 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:51:18.357 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.358 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.358 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.358 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5aa818c-2f4e-4e79-99ff-d4d91d7c3ce5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.358293', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42b2414-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': 'dc6d14a5f518c2236f321fc53138388c3afd581857db2c5869d0e446191a7e72'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.358293', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42b36a2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '6a246f6b79746c8977d21cad23cd8bdd13b9d13b43ee77ca2a52eb9a3a480a5b'}]}, 'timestamp': '2026-02-20 09:51:18.359209', '_unique_id': '18ac07b2f31b4efabd2f93429b182269'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5156e247-11b7-4704-93f7-ba16cb7584ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.361339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42b9a48-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '95a35c9a3a9e73f35a765f1d147499a0ee7ea505c89af3993be03382bd66c1a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 
'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.361339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42bab78-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': 'd335b71cd81d9f584263c4ebff0e3ba8827c511c4f443ed160a6c151ecf98b4f'}]}, 'timestamp': '2026-02-20 09:51:18.362232', '_unique_id': 'ca0f32f813a747b7adec95f56e406e7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.364 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.364 12 DEBUG 
ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f958db2c-bded-426a-92be-e6dde503f416', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.364407', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42c1266-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '36057ef342ab2647ed3f61d8517ef47b0dbc90daee12ad75819303c64f28288c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.364407', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42c23f0-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': 'e38e3ae862e273e4a47620ac9552068999574ee7bbd113212d366ce0074b55d0'}]}, 'timestamp': '2026-02-20 09:51:18.365284', '_unique_id': '070e0a75ffb74c68a2c1c138bc7b80fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:51:18.366 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.367 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.367 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 
04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a74aa16a-ca3d-40a8-8e3f-6cfe83fa1aed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.367162', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42c7af8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': 'f8cc19db6b907fd5592c1a257e76a93eafe4d9902408d810d41347f7912140ba'}]}, 'timestamp': '2026-02-20 09:51:18.367455', '_unique_id': '18da39b7229a4d4da949c11962118826'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 
04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd667267-2e09-4a23-8801-6ee78c81a17a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.368824', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42cbbbc-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '969c741fad99bfa85917933a41da62a85c31ff7fd92a89f8136dbf527f374d72'}]}, 'timestamp': '2026-02-20 09:51:18.369112', '_unique_id': '63e5d7964b2346fb84beab66f97afeca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.370 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6b0c1b8f-6dca-42ff-b9f3-e2a331c5854c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.370408', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42cf974-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '91423644838db6c51b5df22716d8a3618b5e0351f410aa9aabc25da0b7b1f995'}]}, 'timestamp': '2026-02-20 09:51:18.370711', '_unique_id': 'c8b519a4f3204b08b8fbb0a635f618b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ff8c79e-095b-4d1c-abf5-fc227cbc37e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.372061', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42d39e8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': 'dd778da10f5fd1fe1c1954adf425aea1a36f66b86c7f6f1bb1285315e0c6f6f2'}]}, 'timestamp': '2026-02-20 09:51:18.372341', '_unique_id': 'f56e96d1d6004ca8b9fca226d7a1d1e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.373 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.373 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cfd08a11-8fbe-43f1-9824-4287a61a5628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.373594', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42d76c4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': 'ef4fa13c92ca2f6552e3ac8291cdcc3233c5d022fe531389ce0f8369853d3cc3'}]}, 'timestamp': '2026-02-20 09:51:18.373899', '_unique_id': '80c32f01b81946aea392765742c01c23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging Feb 20 04:51:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.375 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.375 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.375 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b1cb260-fa20-4fe4-b01c-fc24c3a5014e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.375161', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42db2ec-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': 'a8a6942dba88642a7fc121932b879ef8b3295b14d1114ed4286408256de38eee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.375161', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42dbcba-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': 'db3547d5673f97c7f1b7d1d03b54be64b80d99a331865769fd827469a17d4b09'}]}, 'timestamp': '2026-02-20 09:51:18.375695', '_unique_id': 'fa8ca3e1ca774a888a9b2c2f3de747bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12
ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:51:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:51:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e95 e95: 6 total, 6 up, 6 in
Feb 20 04:51:18 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:18.766 2 INFO neutron.agent.securitygroups_rpc [req-3c77ea9c-030b-4c3f-a6b2-e9f761f0d591 req-ae9f50d3-4bb2-48d4-a279-bccc17ebbc38 19c6a0af0d664b5d92fdce6a6ecdbcc4 5ce7589beebc4b9187ac7a68f3264776 - - default default] Security group rule updated ['ddf49fd2-9d36-4d8c-9b90-f70fbafa6560']
Feb 20 04:51:19 localhost nova_compute[281288]: 2026-02-20 09:51:19.300 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:19.360 2 INFO neutron.agent.securitygroups_rpc [req-cf23cb9e-603b-4426-8e72-b88eccda31be req-4c57a8df-f22b-4186-8db6-2e7fa9aa1e7d 19c6a0af0d664b5d92fdce6a6ecdbcc4 5ce7589beebc4b9187ac7a68f3264776 - - default default] Security group rule updated ['ddf49fd2-9d36-4d8c-9b90-f70fbafa6560']
Feb 20 04:51:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:19.903 2 INFO neutron.agent.securitygroups_rpc [None req-b04d749d-19a2-4f89-bafc-552dc6778fc9 ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Security group member updated ['6a912071-fd9c-4d5f-8453-7f993db3506d']
Feb 20 04:51:20 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:20.421 264355 INFO neutron.agent.linux.ip_lib [None req-d50820b1-4b96-4482-8fee-3816155b48ac - - - - - -] Device tap22bf7523-8a cannot be used as it has no MAC address
Feb 20 04:51:20 localhost nova_compute[281288]: 2026-02-20 09:51:20.441 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:20 localhost kernel: device tap22bf7523-8a entered
promiscuous mode
Feb 20 04:51:20 localhost NetworkManager[5988]: [1771581080.4495] manager: (tap22bf7523-8a): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Feb 20 04:51:20 localhost ovn_controller[156798]: 2026-02-20T09:51:20Z|00086|binding|INFO|Claiming lport 22bf7523-8a19-46b0-a0b7-53070ea1823e for this chassis.
Feb 20 04:51:20 localhost ovn_controller[156798]: 2026-02-20T09:51:20Z|00087|binding|INFO|22bf7523-8a19-46b0-a0b7-53070ea1823e: Claiming unknown
Feb 20 04:51:20 localhost nova_compute[281288]: 2026-02-20 09:51:20.452 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:20 localhost systemd-udevd[307753]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 04:51:20 localhost nova_compute[281288]: 2026-02-20 09:51:20.459 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:20.468 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7655fb8f-4890-4990-9fdf-4d25849654f0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=22bf7523-8a19-46b0-a0b7-53070ea1823e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 04:51:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:20.470 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 22bf7523-8a19-46b0-a0b7-53070ea1823e in datapath 9021dc49-7e01-42e7-8f32-572dec89afcc bound to our chassis
Feb 20 04:51:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:20.474 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port fc126e7a-67b5-4025-9da6-7c8301672033 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 04:51:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:20.474 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9021dc49-7e01-42e7-8f32-572dec89afcc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 04:51:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:20.475 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[96bec316-613d-416d-8fc8-cdcd0a3f32b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 04:51:20 localhost journal[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 04:51:20 localhost nova_compute[281288]: 2026-02-20 09:51:20.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:20 localhost journal[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 04:51:20 localhost ovn_controller[156798]: 2026-02-20T09:51:20Z|00088|binding|INFO|Setting lport 22bf7523-8a19-46b0-a0b7-53070ea1823e ovn-installed in OVS
Feb 20 04:51:20 localhost ovn_controller[156798]: 2026-02-20T09:51:20Z|00089|binding|INFO|Setting lport 22bf7523-8a19-46b0-a0b7-53070ea1823e up in Southbound
Feb 20 04:51:20 localhost nova_compute[281288]: 2026-02-20 09:51:20.491 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:20 localhost journal[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 04:51:20 localhost journal[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 04:51:20 localhost journal[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 04:51:20 localhost journal[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 04:51:20 localhost journal[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 04:51:20 localhost nova_compute[281288]: 2026-02-20 09:51:20.514 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:20 localhost journal[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 04:51:20 localhost nova_compute[281288]: 2026-02-20 09:51:20.539 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e96 e96: 6 total, 6 up, 6 in
Feb 20 04:51:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:51:21 localhost podman[307825]:
Feb 20 04:51:21 localhost podman[307825]: 2026-02-20
09:51:21.450463321 +0000 UTC m=+0.082415821 container create d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 04:51:21 localhost systemd[1]: Started libpod-conmon-d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635.scope.
Feb 20 04:51:21 localhost systemd[1]: Started libcrun container.
Feb 20 04:51:21 localhost podman[307825]: 2026-02-20 09:51:21.411924351 +0000 UTC m=+0.043876861 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 04:51:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/faa9e2373e9edef47d940be28280fa18e93a1d836a7561ac2b42ed8a739e240e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 04:51:21 localhost podman[307825]: 2026-02-20 09:51:21.523756991 +0000 UTC m=+0.155709481 container init d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 20 04:51:21 localhost podman[307825]: 2026-02-20 09:51:21.531691017 +0000 UTC m=+0.163643517 container start d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 20 04:51:21 localhost dnsmasq[307844]: started, version 2.85 cachesize 150
Feb 20 04:51:21 localhost dnsmasq[307844]: DNS service limited to local subnets
Feb 20 04:51:21 localhost dnsmasq[307844]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 04:51:21 localhost dnsmasq[307844]: warning: no upstream servers configured
Feb 20 04:51:21 localhost dnsmasq-dhcp[307844]: DHCP, static leases only on 19.80.0.0, lease time 1d
Feb 20 04:51:21 localhost dnsmasq[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/addn_hosts - 0 addresses
Feb 20 04:51:21 localhost dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/host
Feb 20 04:51:21 localhost dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/opts
Feb 20 04:51:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:21.595 264355 INFO neutron.agent.dhcp.agent [None req-6470ccaf-2d7c-4ace-b3df-cc75451ce300 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:19Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ce4822a0-5e7a-4c40-9856-6c8879a12ac7, ip_allocation=immediate, mac_address=fa:16:3e:ef:22:88, name=tempest-subport-288633192, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:17Z, description=, dns_domain=, id=9021dc49-7e01-42e7-8f32-572dec89afcc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-1209378868, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36362, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=526, status=ACTIVE, subnets=['dd9ea435-5cb2-4df9-b036-81064a982eb1'], tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:51:18Z, vlan_transparent=None, network_id=9021dc49-7e01-42e7-8f32-572dec89afcc, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6a912071-fd9c-4d5f-8453-7f993db3506d'], standard_attr_id=536, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:51:19Z on network 9021dc49-7e01-42e7-8f32-572dec89afcc
Feb 20 04:51:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:21.689 264355 INFO neutron.agent.dhcp.agent [None req-014ad46e-8117-4079-a33a-14af686b614c - - - - - -] DHCP configuration for ports {'8069ffae-e153-4a3e-ac83-1cd290da58a3'} is completed
Feb 20 04:51:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e97 e97: 6 total, 6 up, 6 in
Feb 20 04:51:21 localhost dnsmasq[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/addn_hosts - 1 addresses
Feb 20 04:51:21 localhost dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/host
Feb 20 04:51:21 localhost dnsmasq-dhcp[307844]: read
/var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/opts
Feb 20 04:51:21 localhost podman[307860]: 2026-02-20 09:51:21.834247991 +0000 UTC m=+0.060672502 container kill d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 04:51:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:51:22 localhost podman[307882]: 2026-02-20 09:51:22.143441644 +0000 UTC m=+0.079967349 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 04:51:22 localhost podman[307882]: 2026-02-20 09:51:22.17712617 +0000 UTC m=+0.113651865 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 04:51:22 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 04:51:22 localhost nova_compute[281288]: 2026-02-20 09:51:22.234 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:51:22 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:22.550 264355 INFO neutron.agent.dhcp.agent [None req-0d61c553-c9e7-4395-903b-ef9b589e58d8 - - - - - -] DHCP configuration for ports {'ce4822a0-5e7a-4c40-9856-6c8879a12ac7'} is completed
Feb 20 04:51:22 localhost nova_compute[281288]: 2026-02-20 09:51:22.905 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 04:51:22 localhost nova_compute[281288]: 2026-02-20 09:51:22.906 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 04:51:22 localhost nova_compute[281288]: 2026-02-20 09:51:22.934 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Starting instance...
_do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m Feb 20 04:51:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.137 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.138 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.143 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Require both a host and instance NUMA topology to fit instance on host. 
numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.144 281292 INFO nova.compute.claims [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Claim successful on node np0005625204.localdomain#033[00m Feb 20 04:51:23 localhost podman[307905]: 2026-02-20 09:51:23.161816862 +0000 UTC m=+0.093304066 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as 
a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:51:23 localhost podman[307905]: 2026-02-20 09:51:23.208438805 +0000 UTC m=+0.139925989 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1770267347, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': 
'/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9/ubi-minimal) Feb 20 04:51:23 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.465 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.486 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.487 281292 DEBUG nova.compute.provider_tree [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory 
/usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.549 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.586 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_
SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 04:51:23 localhost nova_compute[281288]: 2026-02-20 09:51:23.647 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:24 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:51:24 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2724867462' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.122 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.129 281292 DEBUG nova.compute.provider_tree [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.149 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.184 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default 
default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.185 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.237 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.238 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.257 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.282 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.399 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.400 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.401 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Creating image(s)#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.438 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk does not exist __init__ 
/usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.475 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.516 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.521 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "3692da63af034f7d594aac7c4b8eda10742f09b0" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.522 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "3692da63af034f7d594aac7c4b8eda10742f09b0" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.544 281292 WARNING oslo_policy.policy [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default 
default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.544 281292 WARNING oslo_policy.policy [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.546 281292 DEBUG nova.policy [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ba15d0e9919d4594a2e6e9d6b3414a5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m Feb 20 04:51:24 localhost nova_compute[281288]: 2026-02-20 09:51:24.604 281292 DEBUG nova.virt.libvirt.imagebackend [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e 
e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Image locations are: [{'url': 'rbd://a8557ee9-b55d-5519-942c-cf8f6172f1d8/images/06bd71fd-c415-45d9-b669-46209b7ca2f4/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://a8557ee9-b55d-5519-942c-cf8f6172f1d8/images/06bd71fd-c415-45d9-b669-46209b7ca2f4/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m Feb 20 04:51:24 localhost ovn_controller[156798]: 2026-02-20T09:51:24Z|00090|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.090 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:25 localhost systemd[1]: tmp-crun.LHJAEf.mount: Deactivated successfully. Feb 20 04:51:25 localhost dnsmasq[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/addn_hosts - 0 addresses Feb 20 04:51:25 localhost dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/host Feb 20 04:51:25 localhost dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/opts Feb 20 04:51:25 localhost podman[308018]: 2026-02-20 09:51:25.357563327 +0000 UTC m=+0.078545736 container kill 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:51:25 localhost ovn_controller[156798]: 
2026-02-20T09:51:25Z|00091|binding|INFO|Releasing lport e1376599-c9f0-4546-a6b8-9a26e1215192 from this chassis (sb_readonly=0) Feb 20 04:51:25 localhost ovn_controller[156798]: 2026-02-20T09:51:25Z|00092|binding|INFO|Setting lport e1376599-c9f0-4546-a6b8-9a26e1215192 down in Southbound Feb 20 04:51:25 localhost kernel: device tape1376599-c9 left promiscuous mode Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.549 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:25.552 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-c461e2c0-bc21-4786-8276-a80f7d59d18a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c461e2c0-bc21-4786-8276-a80f7d59d18a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ce7589beebc4b9187ac7a68f3264776', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b998e545-1dcc-4262-8de8-c6bf3daefa6e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e1376599-c9f0-4546-a6b8-9a26e1215192) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.552 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:25.556 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e1376599-c9f0-4546-a6b8-9a26e1215192 in datapath c461e2c0-bc21-4786-8276-a80f7d59d18a unbound from our chassis#033[00m Feb 20 04:51:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:25.560 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c461e2c0-bc21-4786-8276-a80f7d59d18a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:51:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:25.561 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[04b51bf8-61bb-4b6e-afef-236c35dece96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.638 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:25 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.728 281292 DEBUG 
oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.part --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.730 281292 DEBUG nova.virt.images [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] 06bd71fd-c415-45d9-b669-46209b7ca2f4 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.731 281292 DEBUG nova.privsep.utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.732 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.part /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:25 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e98 e98: 6 total, 6 up, 6 in Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.975 281292 DEBUG 
oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.part /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.converted" returned: 0 in 0.243s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:25 localhost nova_compute[281288]: 2026-02-20 09:51:25.980 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:26 localhost nova_compute[281288]: 2026-02-20 09:51:26.053 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.converted --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:26 localhost nova_compute[281288]: 2026-02-20 09:51:26.055 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "3692da63af034f7d594aac7c4b8eda10742f09b0" "released" by 
"nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:26 localhost nova_compute[281288]: 2026-02-20 09:51:26.093 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Feb 20 04:51:26 localhost nova_compute[281288]: 2026-02-20 09:51:26.099 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:26 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:26.476 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005625204.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:15Z, description=, device_id=90eb8d1f-8d13-4395-9d15-67fdaa60632d, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=tempest-livemigrationtest-server-721665546, extra_dhcp_opts=[], fixed_ips=[], id=609a0699-8716-4bf8-9f50-bfeec5f65721, ip_allocation=immediate, mac_address=fa:16:3e:c0:a3:f9, name=tempest-parent-420346976, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, 
revision_number=2, security_groups=['6a912071-fd9c-4d5f-8453-7f993db3506d'], standard_attr_id=521, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, trunk_details=sub_ports=[], trunk_id=bb723cd7-ac34-46b0-bf66-79c7ed1fe96f, updated_at=2026-02-20T09:51:25Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0#033[00m Feb 20 04:51:26 localhost openstack_network_exporter[244414]: ERROR 09:51:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:51:26 localhost openstack_network_exporter[244414]: Feb 20 04:51:26 localhost openstack_network_exporter[244414]: ERROR 09:51:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:51:26 localhost openstack_network_exporter[244414]: Feb 20 04:51:26 localhost nova_compute[281288]: 2026-02-20 09:51:26.704 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:26 localhost dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 2 addresses Feb 20 04:51:26 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host Feb 20 04:51:26 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts Feb 20 04:51:26 localhost podman[308111]: 2026-02-20 09:51:26.714447414 +0000 UTC m=+0.066008582 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:51:26 localhost nova_compute[281288]: 2026-02-20 09:51:26.836 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] resizing rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m Feb 20 04:51:26 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:26.923 264355 INFO neutron.agent.dhcp.agent [None req-2127b375-9167-4379-8284-d0a570c4be85 - - - - - -] DHCP configuration for ports {'609a0699-8716-4bf8-9f50-bfeec5f65721'} is completed#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.024 281292 DEBUG nova.objects.instance [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lazy-loading 'migration_context' on Instance uuid 90eb8d1f-8d13-4395-9d15-67fdaa60632d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.038 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.039 281292 DEBUG nova.virt.libvirt.driver [None 
req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Ensure instance console log exists: /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.039 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.040 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.041 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:27 localhost ovn_controller[156798]: 2026-02-20T09:51:27Z|00093|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.224 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.236 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.239 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:27 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e99 e99: 6 total, 6 up, 6 in Feb 20 04:51:27 localhost dnsmasq[307581]: exiting on receipt of SIGTERM Feb 20 04:51:27 localhost podman[308222]: 2026-02-20 09:51:27.583953147 +0000 UTC m=+0.061955261 container kill 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:51:27 localhost systemd[1]: libpod-44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3.scope: Deactivated successfully. 
Feb 20 04:51:27 localhost podman[308236]: 2026-02-20 09:51:27.659586145 +0000 UTC m=+0.059891869 container died 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:51:27 localhost systemd[1]: var-lib-containers-storage-overlay-678080229dd1159fab2cef2bc14cfbf02bb404410bb244a2f4e58b96ed1ce8f7-merged.mount: Deactivated successfully. Feb 20 04:51:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3-userdata-shm.mount: Deactivated successfully. Feb 20 04:51:27 localhost podman[308236]: 2026-02-20 09:51:27.753645683 +0000 UTC m=+0.153951367 container cleanup 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:51:27 localhost systemd[1]: libpod-conmon-44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3.scope: Deactivated successfully. 
Feb 20 04:51:27 localhost podman[308238]: 2026-02-20 09:51:27.780793844 +0000 UTC m=+0.172926865 container remove 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.799 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Successfully updated port: 609a0699-8716-4bf8-9f50-bfeec5f65721 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m Feb 20 04:51:27 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:27.810 264355 INFO neutron.agent.dhcp.agent [None req-79c30ef8-3ce6-4ac2-99b9-583d7fae01d1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:51:27 localhost systemd[1]: run-netns-qdhcp\x2dc461e2c0\x2dbc21\x2d4786\x2d8276\x2da80f7d59d18a.mount: Deactivated successfully. 
Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.823 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.824 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquired lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.824 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Feb 20 04:51:27 localhost nova_compute[281288]: 2026-02-20 09:51:27.906 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Feb 20 04:51:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:28.039 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.362 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updating instance_info_cache with network_info: [{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.398 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e 
e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Releasing lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.399 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance network_info: |[{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.404 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Start _get_guest_xml 
network_info=[{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-20T09:49:57Z,direct_url=,disk_format='qcow2',id=06bd71fd-c415-45d9-b669-46209b7ca2f4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='91bce661d685472eb3e7cacab17bf52a',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2026-02-20T09:49:59Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 
'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_options': None, 'image_id': '06bd71fd-c415-45d9-b669-46209b7ca2f4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.411 281292 WARNING nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.414 281292 DEBUG nova.virt.libvirt.host [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Searching host: 'np0005625204.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.415 281292 DEBUG nova.virt.libvirt.host [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.417 281292 DEBUG nova.virt.libvirt.host [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Searching host: 'np0005625204.localdomain' for CPU controller through CGroups V2... 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.418 281292 DEBUG nova.virt.libvirt.host [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.418 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.419 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-20T09:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='40a6f41a-8891-4900-942e-688a656af142',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-20T09:49:57Z,direct_url=,disk_format='qcow2',id=06bd71fd-c415-45d9-b669-46209b7ca2f4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='91bce661d685472eb3e7cacab17bf52a',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2026-02-20T09:49:59Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.420 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.420 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.420 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.421 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.421 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.422 281292 DEBUG 
nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.422 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.423 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.423 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.424 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.429 281292 
DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.594 281292 DEBUG nova.compute.manager [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-changed-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.595 281292 DEBUG nova.compute.manager [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Refreshing instance network info cache due to event network-changed-609a0699-8716-4bf8-9f50-bfeec5f65721. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.596 281292 DEBUG oslo_concurrency.lockutils [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.597 281292 DEBUG oslo_concurrency.lockutils [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquired lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.597 281292 DEBUG nova.network.neutron [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Refreshing network info cache for port 609a0699-8716-4bf8-9f50-bfeec5f65721 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Feb 20 04:51:28 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e100 e100: 6 total, 6 up, 6 in Feb 20 04:51:28 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 20 04:51:28 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3117282067' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 20 04:51:28 localhost nova_compute[281288]: 2026-02-20 09:51:28.937 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.047 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Feb 20 04:51:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:51:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.064 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:29 localhost podman[308305]: 2026-02-20 09:51:29.162474941 +0000 UTC m=+0.087445342 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 04:51:29 localhost podman[308304]: 2026-02-20 09:51:29.188371045 +0000 UTC m=+0.111530322 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.221 281292 DEBUG nova.network.neutron [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updated VIF entry in instance network info cache for port 609a0699-8716-4bf8-9f50-bfeec5f65721. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.222 281292 DEBUG nova.network.neutron [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updating instance_info_cache with network_info: [{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, 
"delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.240 281292 DEBUG oslo_concurrency.lockutils [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Releasing lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:51:29 localhost podman[308304]: 2026-02-20 09:51:29.244742068 +0000 UTC m=+0.167901325 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible) Feb 20 04:51:29 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:51:29 localhost podman[308305]: 2026-02-20 09:51:29.29804834 +0000 UTC m=+0.223018701 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:51:29 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:51:29 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 20 04:51:29 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3784175978' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.609 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.611 281292 DEBUG nova.virt.libvirt.vif [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T09:51:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-721665546',display_name='tempest-LiveMigrationTest-server-721665546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005625204.localdomain',hostname='tempest-livemigrationtest-server-721665546',id=8,image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e704aae5b1ba49d59262f9aa0c366fb4',ramdisk_id='',reservation_id='r-erbwo03j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2108133970',owner_user_name='tempest-LiveMigrationTest-2108133970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-20T09:51:24Z,user_data=None,user_id='ba15d0e9919d4594a2e6e9d6b3414a5e',uuid=90eb8d1f-8d13-4395-9d15-67fdaa60632d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') 
vif={"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.612 281292 DEBUG nova.network.os_vif_util [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Converting VIF {"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, 
"tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.613 281292 DEBUG nova.network.os_vif_util [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.616 281292 DEBUG nova.objects.instance [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 90eb8d1f-8d13-4395-9d15-67fdaa60632d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.635 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] End _get_guest_xml xml= Feb 20 04:51:29 
localhost nova_compute[281288]: 90eb8d1f-8d13-4395-9d15-67fdaa60632d Feb 20 04:51:29 localhost nova_compute[281288]: instance-00000008 Feb 20 04:51:29 localhost nova_compute[281288]: 131072 Feb 20 04:51:29 localhost nova_compute[281288]: 1 Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: tempest-LiveMigrationTest-server-721665546 Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:28 Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: 128 Feb 20 04:51:29 localhost nova_compute[281288]: 1 Feb 20 04:51:29 localhost nova_compute[281288]: 0 Feb 20 04:51:29 localhost nova_compute[281288]: 0 Feb 20 04:51:29 localhost nova_compute[281288]: 1 Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: tempest-LiveMigrationTest-2108133970-project-member Feb 20 04:51:29 localhost nova_compute[281288]: tempest-LiveMigrationTest-2108133970 Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: RDO Feb 20 04:51:29 localhost nova_compute[281288]: OpenStack Compute Feb 20 04:51:29 localhost nova_compute[281288]: 27.5.2-0.20260127144738.eaa65f0.el9 Feb 20 04:51:29 localhost nova_compute[281288]: 90eb8d1f-8d13-4395-9d15-67fdaa60632d Feb 20 04:51:29 localhost nova_compute[281288]: 
90eb8d1f-8d13-4395-9d15-67fdaa60632d Feb 20 04:51:29 localhost nova_compute[281288]: Virtual Machine Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: hvm Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 
localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: /dev/urandom Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost 
nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: Feb 20 04:51:29 localhost nova_compute[281288]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.637 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Preparing to wait for external event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.637 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.638 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by 
"nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.638 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.640 281292 DEBUG nova.virt.libvirt.vif [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T09:51:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-721665546',display_name='tempest-LiveMigrationTest-server-721665546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005625204.localdomain',hostname='tempest-livemigrationtest-server-721665546',id=8,image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_re
quests=InstancePCIRequests,power_state=0,progress=0,project_id='e704aae5b1ba49d59262f9aa0c366fb4',ramdisk_id='',reservation_id='r-erbwo03j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2108133970',owner_user_name='tempest-LiveMigrationTest-2108133970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-20T09:51:24Z,user_data=None,user_id='ba15d0e9919d4594a2e6e9d6b3414a5e',uuid=90eb8d1f-8d13-4395-9d15-67fdaa60632d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": 
true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.640 281292 DEBUG nova.network.os_vif_util [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Converting VIF {"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.641 281292 DEBUG nova.network.os_vif_util [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.642 281292 DEBUG os_vif [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.643 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.644 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.645 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.649 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.650 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap609a0699-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.650 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap609a0699-87, col_values=(('external_ids', {'iface-id': '609a0699-8716-4bf8-9f50-bfeec5f65721', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:a3:f9', 'vm-uuid': '90eb8d1f-8d13-4395-9d15-67fdaa60632d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.697 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.701 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.707 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.708 281292 INFO os_vif [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87')#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.778 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.779 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] No BDM found with device name sda, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.779 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] No VIF found with MAC fa:16:3e:c0:a3:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.780 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Using config drive#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.820 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.948 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Creating config drive at /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config#033[00m Feb 20 04:51:29 localhost nova_compute[281288]: 2026-02-20 09:51:29.954 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config -ldots 
-allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphx732sd1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.078 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphx732sd1" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.119 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.125 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.347 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e 
e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.349 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Deleting local config drive /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config because it was imported into RBD.#033[00m Feb 20 04:51:30 localhost systemd[1]: Started libvirt secret daemon. Feb 20 04:51:30 localhost kernel: device tap609a0699-87 entered promiscuous mode Feb 20 04:51:30 localhost NetworkManager[5988]: [1771581090.4548] manager: (tap609a0699-87): new Tun device (/org/freedesktop/NetworkManager/Devices/21) Feb 20 04:51:30 localhost ovn_controller[156798]: 2026-02-20T09:51:30Z|00094|binding|INFO|Claiming lport 609a0699-8716-4bf8-9f50-bfeec5f65721 for this chassis. Feb 20 04:51:30 localhost ovn_controller[156798]: 2026-02-20T09:51:30Z|00095|binding|INFO|609a0699-8716-4bf8-9f50-bfeec5f65721: Claiming fa:16:3e:c0:a3:f9 10.100.0.12 Feb 20 04:51:30 localhost ovn_controller[156798]: 2026-02-20T09:51:30Z|00096|binding|INFO|Claiming lport ce4822a0-5e7a-4c40-9856-6c8879a12ac7 for this chassis. Feb 20 04:51:30 localhost ovn_controller[156798]: 2026-02-20T09:51:30Z|00097|binding|INFO|ce4822a0-5e7a-4c40-9856-6c8879a12ac7: Claiming fa:16:3e:ef:22:88 19.80.0.55 Feb 20 04:51:30 localhost systemd-udevd[308456]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.459 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.474 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:a3:f9 10.100.0.12'], port_security=['fa:16:3e:c0:a3:f9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-420346976', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '90eb8d1f-8d13-4395-9d15-67fdaa60632d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-420346976', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6a912071-fd9c-4d5f-8453-7f993db3506d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad9ac3f8-d9ff-4a1d-8092-e57f93de7b33, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=609a0699-8716-4bf8-9f50-bfeec5f65721) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.477 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), 
priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:22:88 19.80.0.55'], port_security=['fa:16:3e:ef:22:88 19.80.0.55'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['609a0699-8716-4bf8-9f50-bfeec5f65721'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-288633192', 'neutron:cidrs': '19.80.0.55/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-288633192', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '6a912071-fd9c-4d5f-8453-7f993db3506d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=7655fb8f-4890-4990-9fdf-4d25849654f0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ce4822a0-5e7a-4c40-9856-6c8879a12ac7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:30 localhost NetworkManager[5988]: [1771581090.4787] device (tap609a0699-87): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 20 04:51:30 localhost ovn_controller[156798]: 2026-02-20T09:51:30Z|00098|binding|INFO|Setting lport 609a0699-8716-4bf8-9f50-bfeec5f65721 up in Southbound Feb 20 04:51:30 localhost ovn_controller[156798]: 2026-02-20T09:51:30Z|00099|binding|INFO|Setting lport ce4822a0-5e7a-4c40-9856-6c8879a12ac7 up in Southbound Feb 20 04:51:30 localhost NetworkManager[5988]: [1771581090.4823] device (tap609a0699-87): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Feb 20 04:51:30 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:51:30.478 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 609a0699-8716-4bf8-9f50-bfeec5f65721 in datapath 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 bound to our chassis#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.482 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8d59c69c-3a69-449e-9d36-233c1f4c5c30 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.483 162652 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0#033[00m Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.488 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:30 localhost ovn_controller[156798]: 2026-02-20T09:51:30Z|00100|binding|INFO|Setting lport 609a0699-8716-4bf8-9f50-bfeec5f65721 ovn-installed in OVS Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.494 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7a85a5-ace3-40f6-96c6-14fc309d89d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.495 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap51f8ae9c-11 in ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:51:30.497 162782 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap51f8ae9c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.497 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[39a2bf12-de53-4c37-ba0e-3cf62e5b950e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.498 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[020afea6-67c4-45b0-bac2-7639e698d58b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.508 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9cc99d-ed10-4859-a9bd-745aba2d1fb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost systemd-machined[85698]: New machine qemu-3-instance-00000008. Feb 20 04:51:30 localhost systemd[1]: Started Virtual Machine qemu-3-instance-00000008. 
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.532 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[395f8974-e03e-4b31-9f3f-e542edecb4e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.566 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[fdcee394-a1ec-46b7-a5d4-c7d2b1b06dfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost NetworkManager[5988]: [1771581090.5746] manager: (tap51f8ae9c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/22) Feb 20 04:51:30 localhost systemd-udevd[308459]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.573 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[127b08a2-a7dc-4e30-aba1-6cca0897327a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.612 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[b18b3c21-cf27-4236-8845-7c5e3bb2c50e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.617 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[909a552b-febb-4d02-8cf7-f9e128e0e395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap51f8ae9c-11: link becomes ready Feb 20 04:51:30 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap51f8ae9c-10: link becomes ready Feb 20 04:51:30 localhost NetworkManager[5988]: [1771581090.6464] device (tap51f8ae9c-10): carrier: link connected Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.652 
162915 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcab3e8-d39b-423f-a524-96896c273395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.669 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[86a7956b-8d47-4a89-a950-61b88ebf4fb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51f8ae9c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:63:f7:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 
'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164981, 'reachable_time': 34961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 
'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308494, 'error': None, 'target': 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.684 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[723908b4-38b3-4b96-8f11-b3090e4be4cc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:f7d8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1164981, 'tstamp': 1164981}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308502, 'error': None, 'target': 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.697 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b96ee882-6b86-401e-b9ac-7c816b6c7fad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': 
[['IFLA_IFNAME', 'tap51f8ae9c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:63:f7:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 
'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164981, 'reachable_time': 34961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308511, 'error': None, 'target': 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.730 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[de51d8e0-fdf6-4188-9307-ebc563c8159a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.793 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf065d6-5b6d-4fc7-ac70-eb5ace91b160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.794 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51f8ae9c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.794 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.795 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51f8ae9c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:30 localhost kernel: device tap51f8ae9c-10 entered promiscuous mode
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.832 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.834 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap51f8ae9c-10, col_values=(('external_ids', {'iface-id': '2b93bbc2-5aeb-49cc-b610-6f4f7708d346'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.835 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:30 localhost ovn_controller[156798]: 2026-02-20T09:51:30Z|00101|binding|INFO|Releasing lport 2b93bbc2-5aeb-49cc-b610-6f4f7708d346 from this chassis (sb_readonly=0)
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.848 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.849 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.849 162652 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.850 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[88598a3e-70f5-4e39-ac1a-abe6244f7798]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.851 162652 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg =
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: global
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: log /dev/log local0 debug
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: log-tag haproxy-metadata-proxy-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: user root
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: group root
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: maxconn 1024
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: pidfile /var/lib/neutron/external/pids/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0.pid.haproxy
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: daemon
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]:
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: defaults
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: log global
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: mode http
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: option httplog
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: option dontlognull
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: option http-server-close
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: option forwardfor
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: retries 3
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: timeout http-request 30s
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: timeout connect 30s
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: timeout client 32s
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: timeout server 32s
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: timeout http-keep-alive 30s
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]:
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]:
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: listen listener
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: bind 169.254.169.254:80
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: server metadata /var/lib/neutron/metadata_proxy
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: http-request add-header X-OVN-Network-ID 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Feb 20 04:51:30 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:30.851 162652 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'env', 'PROCESS_TAG=haproxy-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Feb 20 04:51:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e101 e101: 6 total, 6 up, 6 in
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.956 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.956 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Started (Lifecycle Event)#033[00m
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.980 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.984 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Feb 20 04:51:30 localhost nova_compute[281288]: 2026-02-20 09:51:30.984 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Paused (Lifecycle Event)#033[00m
Feb 20 04:51:31 localhost nova_compute[281288]: 2026-02-20 09:51:31.008 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb 20 04:51:31 localhost nova_compute[281288]: 2026-02-20 09:51:31.011 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Feb 20 04:51:31 localhost nova_compute[281288]: 2026-02-20 09:51:31.055 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Feb 20 04:51:31 localhost podman[308572]:
Feb 20 04:51:31 localhost podman[308572]: 2026-02-20 09:51:31.333086595 +0000 UTC m=+0.073969709 container create fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 04:51:31 localhost systemd[1]: Started libpod-conmon-fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19.scope.
Feb 20 04:51:31 localhost systemd[1]: Started libcrun container.
Feb 20 04:51:31 localhost podman[308572]: 2026-02-20 09:51:31.292752061 +0000 UTC m=+0.033635165 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 04:51:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7397e54adaacdacf436fa6c7d8a45f9bdf2c03bd965044011b1c1cebe1f2aa8f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 04:51:31 localhost podman[308572]: 2026-02-20 09:51:31.403184348 +0000 UTC m=+0.144067472 container init fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 04:51:31 localhost podman[308572]: 2026-02-20 09:51:31.412084074 +0000 UTC m=+0.152967188 container start fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:51:31 localhost neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [NOTICE] (308590) : New worker (308592) forked
Feb 20 04:51:31 localhost neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [NOTICE] (308590) : Loading success.
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.487 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ce4822a0-5e7a-4c40-9856-6c8879a12ac7 in datapath 9021dc49-7e01-42e7-8f32-572dec89afcc unbound from our chassis#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.491 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port fc126e7a-67b5-4025-9da6-7c8301672033 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.491 162652 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9021dc49-7e01-42e7-8f32-572dec89afcc#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.500 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[ba9bfdb9-9305-47eb-bebd-b66bff1ad939]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.502 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9021dc49-71 in ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.504 162782 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9021dc49-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.504 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[48a78f7a-3c13-4323-878d-d228b30da236]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.505 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[17b124a8-1df3-44b8-bc34-b9e1c409d54f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.515 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0d2956-d073-42b2-828c-2ce24a6a1c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.528 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[268fae37-d796-4372-acbe-68c78fad7a22]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.555 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[684b3015-f664-4bd1-91b7-c73626286232]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.563 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[aef0e6e7-6c2c-457e-99c2-58ddc6e4d032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost systemd-udevd[308484]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 04:51:31 localhost NetworkManager[5988]: [1771581091.5681] manager: (tap9021dc49-70): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.600 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb7d4ec-4381-4286-acbd-ffe809fead05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.604 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[e5de424e-0375-4cf5-8b5a-7cec86115e6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9021dc49-71: link becomes ready
Feb 20 04:51:31 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9021dc49-70: link becomes ready
Feb 20 04:51:31 localhost NetworkManager[5988]: [1771581091.6315] device (tap9021dc49-70): carrier: link connected
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.638 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9d07be-04b5-49c8-8044-bddc0851db86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.655 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[edd2cdcb-a97b-4629-a64c-de384cb85ecb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9021dc49-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:cd:0a:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1165080, 'reachable_time': 24304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308611, 'error': None, 'target': 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.675 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec1ec04-f2fa-4271-a37b-7a32a3eacc03]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:a45'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1165080, 'tstamp': 1165080}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308612, 'error': None, 'target': 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.693 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4a022fd3-b92b-44c9-af7b-385911879ae8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9021dc49-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:cd:0a:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1165080, 'reachable_time': 24304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308613, 'error': None, 'target': 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.724 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[04a41fa0-3c53-4455-9924-dc5758b0f400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.788 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[950f4b1a-8dbf-4505-ba5c-1c370e53d559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.791 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9021dc49-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.791 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.792 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9021dc49-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 20 04:51:31 localhost kernel: device tap9021dc49-70 entered promiscuous mode
Feb 20 04:51:31 localhost nova_compute[281288]: 2026-02-20 09:51:31.794 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.800 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9021dc49-70, col_values=(('external_ids', {'iface-id': '8069ffae-e153-4a3e-ac83-1cd290da58a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 20 04:51:31 localhost nova_compute[281288]: 2026-02-20 09:51:31.803 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:31 localhost ovn_controller[156798]: 2026-02-20T09:51:31Z|00102|binding|INFO|Releasing lport 8069ffae-e153-4a3e-ac83-1cd290da58a3 from this chassis (sb_readonly=0)
Feb 20 04:51:31 localhost nova_compute[281288]: 2026-02-20 09:51:31.804 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.805 162652 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9021dc49-7e01-42e7-8f32-572dec89afcc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9021dc49-7e01-42e7-8f32-572dec89afcc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.806 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2ecc98-51a4-4522-b479-cb68051ef3b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.807 162652 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg =
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: global
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: log /dev/log local0 debug
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: log-tag haproxy-metadata-proxy-9021dc49-7e01-42e7-8f32-572dec89afcc
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: user root
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: group root
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: maxconn 1024
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: pidfile /var/lib/neutron/external/pids/9021dc49-7e01-42e7-8f32-572dec89afcc.pid.haproxy
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: daemon
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]:
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: defaults
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: log global
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: mode http
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: option httplog
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: option dontlognull
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: option http-server-close
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: option forwardfor
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: retries 3
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: timeout http-request 30s
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: timeout connect 30s
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: timeout client 32s
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: timeout server 32s
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: timeout http-keep-alive 30s
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]:
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]:
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: listen listener
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: bind 169.254.169.254:80
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: server metadata /var/lib/neutron/metadata_proxy
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: http-request add-header X-OVN-Network-ID 9021dc49-7e01-42e7-8f32-572dec89afcc
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Feb 20 04:51:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:31.809 162652 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'env', 'PROCESS_TAG=haproxy-9021dc49-7e01-42e7-8f32-572dec89afcc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9021dc49-7e01-42e7-8f32-572dec89afcc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Feb 20 04:51:31 localhost nova_compute[281288]: 2026-02-20 09:51:31.813 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:32 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:32.129 264355 INFO neutron.agent.linux.ip_lib [None req-6210fe24-3488-4e4e-8ec6-008288325c99 - - - - - -] Device tap21b010cc-c3 cannot be used as it has no MAC address#033[00m
Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.191 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:32 localhost kernel: device tap21b010cc-c3 entered promiscuous mode
Feb 20 04:51:32 localhost NetworkManager[5988]: [1771581092.1997] manager: (tap21b010cc-c3): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.203 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:51:32 localhost ovn_controller[156798]: 2026-02-20T09:51:32Z|00103|binding|INFO|Claiming lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 for this chassis.
Feb 20 04:51:32 localhost ovn_controller[156798]: 2026-02-20T09:51:32Z|00104|binding|INFO|21b010cc-c3ff-4013-97b7-6b7eb23e47a9: Claiming unknown Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.211 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:32.225 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-09ccac50-3316-4f5e-b2ff-0e97a71903d8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09ccac50-3316-4f5e-b2ff-0e97a71903d8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e7d3e1cfe9f4e4d8451c6f0b8be3a29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04bb3800-97e8-42cd-83bb-692b59d74b62, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=21b010cc-c3ff-4013-97b7-6b7eb23e47a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:32 localhost ovn_controller[156798]: 2026-02-20T09:51:32Z|00105|binding|INFO|Setting lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 ovn-installed in OVS Feb 20 04:51:32 localhost ovn_controller[156798]: 
2026-02-20T09:51:32Z|00106|binding|INFO|Setting lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 up in Southbound Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.245 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.297 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.324 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:32 localhost podman[308655]: Feb 20 04:51:32 localhost podman[308655]: 2026-02-20 09:51:32.348929209 +0000 UTC m=+0.117730377 container create 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 04:51:32 localhost systemd[1]: Started libpod-conmon-941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c.scope. Feb 20 04:51:32 localhost podman[308655]: 2026-02-20 09:51:32.29439322 +0000 UTC m=+0.063194378 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 20 04:51:32 localhost systemd[1]: Started libcrun container. 
Feb 20 04:51:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/427c548c5ea3a77d1146e79412578647e7513ef27a630d63200866643b6640c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:51:32 localhost podman[308655]: 2026-02-20 09:51:32.420862827 +0000 UTC m=+0.189663985 container init 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:51:32 localhost podman[308655]: 2026-02-20 09:51:32.430920317 +0000 UTC m=+0.199721475 container start 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:51:32 localhost neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [NOTICE] (308683) : New worker (308688) forked Feb 20 04:51:32 localhost neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [NOTICE] (308683) : Loading success. 
Feb 20 04:51:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:32.489 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 in datapath 09ccac50-3316-4f5e-b2ff-0e97a71903d8 unbound from our chassis#033[00m Feb 20 04:51:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:32.491 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 09ccac50-3316-4f5e-b2ff-0e97a71903d8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:51:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:32.492 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3d9853-cdc8-4f21-b7a3-50b152e999ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:32 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e102 e102: 6 total, 6 up, 6 in Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.793 281292 DEBUG nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.793 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 
04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.794 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.795 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.795 281292 DEBUG nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Processing event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.795 281292 DEBUG nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.796 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.796 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.797 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.797 281292 DEBUG nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] No waiting events found dispatching 
network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.798 281292 WARNING nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received unexpected event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 for instance with vm_state building and task_state spawning.#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.799 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.804 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.804 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Resumed (Lifecycle Event)#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.808 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m Feb 20 
04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.813 281292 INFO nova.virt.libvirt.driver [-] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance spawned successfully.#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.813 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.826 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.836 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.842 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.842 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.843 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.844 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.844 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.845 281292 DEBUG nova.virt.libvirt.driver [None 
req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.874 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.922 281292 INFO nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Took 8.52 seconds to spawn the instance on the hypervisor.#033[00m Feb 20 04:51:32 localhost nova_compute[281288]: 2026-02-20 09:51:32.923 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:51:33 localhost nova_compute[281288]: 2026-02-20 09:51:33.009 281292 INFO nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Took 10.03 seconds to build instance.#033[00m Feb 20 04:51:33 localhost nova_compute[281288]: 2026-02-20 09:51:33.028 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d" 
"released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:33 localhost podman[308736]: Feb 20 04:51:33 localhost podman[308736]: 2026-02-20 09:51:33.282148174 +0000 UTC m=+0.115715956 container create aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:51:33 localhost systemd[1]: Started libpod-conmon-aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19.scope. Feb 20 04:51:33 localhost podman[308736]: 2026-02-20 09:51:33.229687028 +0000 UTC m=+0.063254820 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:51:33 localhost systemd[1]: Started libcrun container. 
Feb 20 04:51:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/422ec30970010cad66130b89158fe73344869aca88877d7c2bd64592bab9a6ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:51:33 localhost podman[308736]: 2026-02-20 09:51:33.348805995 +0000 UTC m=+0.182373777 container init aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:51:33 localhost podman[308736]: 2026-02-20 09:51:33.355530995 +0000 UTC m=+0.189098777 container start aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 04:51:33 localhost dnsmasq[308755]: started, version 2.85 cachesize 150 Feb 20 04:51:33 localhost dnsmasq[308755]: DNS service limited to local subnets Feb 20 04:51:33 localhost dnsmasq[308755]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:51:33 localhost dnsmasq[308755]: warning: no upstream servers 
configured Feb 20 04:51:33 localhost dnsmasq-dhcp[308755]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:51:33 localhost dnsmasq[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/addn_hosts - 0 addresses Feb 20 04:51:33 localhost dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/host Feb 20 04:51:33 localhost dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/opts Feb 20 04:51:33 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:33.787 264355 INFO neutron.agent.dhcp.agent [None req-e25eb512-de70-4ae6-b848-7918d42efd54 - - - - - -] DHCP configuration for ports {'cfc50a35-d356-47aa-8376-a7f780a8f1d2'} is completed#033[00m Feb 20 04:51:34 localhost nova_compute[281288]: 2026-02-20 09:51:34.698 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:34 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:51:34 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0. 
Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.976576) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19 Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581094976628, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2473, "num_deletes": 254, "total_data_size": 3616053, "memory_usage": 3674736, "flush_reason": "Manual Compaction"} Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581094989045, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 2342375, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14585, "largest_seqno": 17053, "table_properties": {"data_size": 2333652, "index_size": 5356, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 19191, "raw_average_key_size": 20, "raw_value_size": 2315578, "raw_average_value_size": 2522, "num_data_blocks": 236, "num_entries": 918, "num_filter_entries": 918, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580912, "oldest_key_time": 1771580912, "file_creation_time": 1771581094, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}} Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 12518 microseconds, and 6189 cpu microseconds. Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.989102) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 2342375 bytes OK Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.989128) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.991091) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.991113) EVENT_LOG_v1 {"time_micros": 1771581094991107, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.991139) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 3605076, prev total WAL file size 
3605076, number of live WAL files 2. Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.992138) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end) Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(2287KB)], [18(17MB)] Feb 20 04:51:34 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581094992180, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 20810879, "oldest_snapshot_seqno": -1} Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 12289 keys, 18890509 bytes, temperature: kUnknown Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581095078718, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 18890509, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18818435, "index_size": 40229, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30789, "raw_key_size": 327851, "raw_average_key_size": 26, "raw_value_size": 18607204, 
"raw_average_value_size": 1514, "num_data_blocks": 1543, "num_entries": 12289, "num_filter_entries": 12289, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581094, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.079113) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 18890509 bytes Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.080843) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 240.1 rd, 218.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 17.6 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(16.9) write-amplify(8.1) OK, records in: 12819, records dropped: 530 output_compression: NoCompression Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.080876) EVENT_LOG_v1 {"time_micros": 1771581095080863, "job": 8, "event": "compaction_finished", "compaction_time_micros": 86659, "compaction_time_cpu_micros": 45281, "output_level": 6, "num_output_files": 1, "total_output_size": 18890509, "num_input_records": 12819, "num_output_records": 12289, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581095081322, "job": 8, "event": "table_file_deletion", "file_number": 20} Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581095083753, "job": 
8, "event": "table_file_deletion", "file_number": 18} Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.992064) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083838) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083849) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083853) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:51:35 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083856) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:51:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:36 localhost nova_compute[281288]: 2026-02-20 09:51:36.095 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Check if temp file /var/lib/nova/instances/tmp4xew7c85 exists to indicate shared storage is being used for migration. Exists? 
False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m Feb 20 04:51:36 localhost nova_compute[281288]: 2026-02-20 09:51:36.096 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] source check data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmp4xew7c85',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='90eb8d1f-8d13-4395-9d15-67fdaa60632d',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m Feb 20 04:51:36 localhost nova_compute[281288]: 2026-02-20 09:51:36.392 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:37 localhost nova_compute[281288]: 2026-02-20 09:51:37.252 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:37.267 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:36Z, description=, device_id=ae5f315b-79d2-4264-afec-ecf48cf37c1f, 
device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2ab664a2-00a2-4e97-877c-3854355c736b, ip_allocation=immediate, mac_address=fa:16:3e:77:fd:60, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:30Z, description=, dns_domain=, id=09ccac50-3316-4f5e-b2ff-0e97a71903d8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-43118640-network, port_security_enabled=True, project_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23356, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=586, status=ACTIVE, subnets=['728af7d3-4d21-4f1d-9b4d-37b28d1c9bfa'], tags=[], tenant_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, updated_at=2026-02-20T09:51:31Z, vlan_transparent=None, network_id=09ccac50-3316-4f5e-b2ff-0e97a71903d8, port_security_enabled=False, project_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=633, status=DOWN, tags=[], tenant_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, updated_at=2026-02-20T09:51:37Z on network 09ccac50-3316-4f5e-b2ff-0e97a71903d8#033[00m Feb 20 04:51:37 localhost nova_compute[281288]: 2026-02-20 09:51:37.478 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:51:37 localhost nova_compute[281288]: 2026-02-20 09:51:37.478 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] 
Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:51:37 localhost nova_compute[281288]: 2026-02-20 09:51:37.484 281292 INFO nova.compute.rpcapi [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m Feb 20 04:51:37 localhost nova_compute[281288]: 2026-02-20 09:51:37.484 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:51:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e103 e103: 6 total, 6 up, 6 in Feb 20 04:51:37 localhost dnsmasq[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/addn_hosts - 1 addresses Feb 20 04:51:37 localhost dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/host Feb 20 04:51:37 localhost podman[308858]: 2026-02-20 09:51:37.609254133 +0000 UTC m=+0.079742853 container kill aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 04:51:37 localhost dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/opts Feb 20 04:51:37 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:51:37 localhost systemd[1]: tmp-crun.15nAlN.mount: Deactivated successfully. Feb 20 04:51:37 localhost systemd[1]: tmp-crun.FBPpLS.mount: Deactivated successfully. Feb 20 04:51:37 localhost podman[308870]: 2026-02-20 09:51:37.737077409 +0000 UTC m=+0.110561822 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true) Feb 20 04:51:37 localhost podman[308870]: 2026-02-20 09:51:37.748145999 +0000 UTC m=+0.121630392 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:51:37 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:51:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:37.964 264355 INFO neutron.agent.dhcp.agent [None req-c3385804-a075-451a-a30a-4dd8dfdb3504 - - - - - -] DHCP configuration for ports {'2ab664a2-00a2-4e97-877c-3854355c736b'} is completed#033[00m Feb 20 04:51:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:38.238 264355 INFO neutron.agent.linux.ip_lib [None req-4ef5b863-73bf-48af-bf17-945910944163 - - - - - -] Device tapf56de90b-39 cannot be used as it has no MAC address#033[00m Feb 20 04:51:38 localhost nova_compute[281288]: 2026-02-20 09:51:38.307 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:38 localhost kernel: device tapf56de90b-39 entered promiscuous mode Feb 20 04:51:38 localhost NetworkManager[5988]: [1771581098.3170] manager: (tapf56de90b-39): new Generic device (/org/freedesktop/NetworkManager/Devices/25) Feb 20 04:51:38 localhost nova_compute[281288]: 2026-02-20 09:51:38.318 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:38 localhost systemd-udevd[308904]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:51:38 localhost ovn_controller[156798]: 2026-02-20T09:51:38Z|00107|binding|INFO|Claiming lport f56de90b-39da-4f5e-beb2-23f63fa15081 for this chassis. 
Feb 20 04:51:38 localhost ovn_controller[156798]: 2026-02-20T09:51:38Z|00108|binding|INFO|f56de90b-39da-4f5e-beb2-23f63fa15081: Claiming unknown Feb 20 04:51:38 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:38.329 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-71b28781-95be-4ab4-86ca-7c852dd117aa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71b28781-95be-4ab4-86ca-7c852dd117aa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bf3a16481834e3a81e04ea40bee1d8d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26340ad6-3c33-4a82-9f2f-3413cbaaea9f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f56de90b-39da-4f5e-beb2-23f63fa15081) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:38 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:38.331 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f56de90b-39da-4f5e-beb2-23f63fa15081 in datapath 71b28781-95be-4ab4-86ca-7c852dd117aa bound to our chassis#033[00m Feb 20 04:51:38 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:38.333 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
71b28781-95be-4ab4-86ca-7c852dd117aa or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:51:38 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:38.334 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[83e823f5-1d64-4fd2-a6a2-680d914d826e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:38 localhost journal[229984]: ethtool ioctl error on tapf56de90b-39: No such device Feb 20 04:51:38 localhost journal[229984]: ethtool ioctl error on tapf56de90b-39: No such device Feb 20 04:51:38 localhost ovn_controller[156798]: 2026-02-20T09:51:38Z|00109|binding|INFO|Setting lport f56de90b-39da-4f5e-beb2-23f63fa15081 ovn-installed in OVS Feb 20 04:51:38 localhost ovn_controller[156798]: 2026-02-20T09:51:38Z|00110|binding|INFO|Setting lport f56de90b-39da-4f5e-beb2-23f63fa15081 up in Southbound Feb 20 04:51:38 localhost nova_compute[281288]: 2026-02-20 09:51:38.360 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:38 localhost journal[229984]: ethtool ioctl error on tapf56de90b-39: No such device Feb 20 04:51:38 localhost journal[229984]: ethtool ioctl error on tapf56de90b-39: No such device Feb 20 04:51:38 localhost journal[229984]: ethtool ioctl error on tapf56de90b-39: No such device Feb 20 04:51:38 localhost journal[229984]: ethtool ioctl error on tapf56de90b-39: No such device Feb 20 04:51:38 localhost journal[229984]: ethtool ioctl error on tapf56de90b-39: No such device Feb 20 04:51:38 localhost journal[229984]: ethtool ioctl error on tapf56de90b-39: No such device Feb 20 04:51:38 localhost nova_compute[281288]: 2026-02-20 09:51:38.397 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Feb 20 04:51:38 localhost nova_compute[281288]: 2026-02-20 09:51:38.427 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:38 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:51:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:39.182 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:36Z, description=, device_id=ae5f315b-79d2-4264-afec-ecf48cf37c1f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2ab664a2-00a2-4e97-877c-3854355c736b, ip_allocation=immediate, mac_address=fa:16:3e:77:fd:60, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:30Z, description=, dns_domain=, id=09ccac50-3316-4f5e-b2ff-0e97a71903d8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-43118640-network, port_security_enabled=True, project_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23356, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=586, status=ACTIVE, subnets=['728af7d3-4d21-4f1d-9b4d-37b28d1c9bfa'], tags=[], tenant_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, updated_at=2026-02-20T09:51:31Z, vlan_transparent=None, network_id=09ccac50-3316-4f5e-b2ff-0e97a71903d8, port_security_enabled=False, project_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=633, status=DOWN, tags=[], 
tenant_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, updated_at=2026-02-20T09:51:37Z on network 09ccac50-3316-4f5e-b2ff-0e97a71903d8#033[00m Feb 20 04:51:39 localhost podman[308977]: Feb 20 04:51:39 localhost podman[308977]: 2026-02-20 09:51:39.378238343 +0000 UTC m=+0.112781908 container create 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127) Feb 20 04:51:39 localhost systemd[1]: Started libpod-conmon-427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f.scope. Feb 20 04:51:39 localhost podman[308977]: 2026-02-20 09:51:39.328815407 +0000 UTC m=+0.063358972 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:51:39 localhost podman[309003]: 2026-02-20 09:51:39.449911714 +0000 UTC m=+0.074522227 container kill aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:51:39 localhost dnsmasq[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/addn_hosts - 1 addresses Feb 20 04:51:39 localhost dnsmasq-dhcp[308755]: 
read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/host Feb 20 04:51:39 localhost systemd[1]: Started libcrun container. Feb 20 04:51:39 localhost dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/opts Feb 20 04:51:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/251cfd009f8c3881b0902462fc67fb4f6f911df6e30e6a2f4b7d252da92d4521/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:51:39 localhost podman[308977]: 2026-02-20 09:51:39.473881479 +0000 UTC m=+0.208425004 container init 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 20 04:51:39 localhost podman[308977]: 2026-02-20 09:51:39.480090764 +0000 UTC m=+0.214634289 container start 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:51:39 localhost dnsmasq[309024]: started, version 2.85 cachesize 150 Feb 20 04:51:39 localhost dnsmasq[309024]: DNS service limited to local subnets Feb 20 04:51:39 localhost 
dnsmasq[309024]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:51:39 localhost dnsmasq[309024]: warning: no upstream servers configured Feb 20 04:51:39 localhost dnsmasq-dhcp[309024]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:51:39 localhost dnsmasq[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/addn_hosts - 0 addresses Feb 20 04:51:39 localhost dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/host Feb 20 04:51:39 localhost dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/opts Feb 20 04:51:39 localhost nova_compute[281288]: 2026-02-20 09:51:39.579 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:39 localhost systemd[1]: tmp-crun.73uX7D.mount: Deactivated successfully. 
Feb 20 04:51:39 localhost nova_compute[281288]: 2026-02-20 09:51:39.700 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:39.774 264355 INFO neutron.agent.dhcp.agent [None req-c7498486-8851-4713-90d3-8f2a3e59c45c - - - - - -] DHCP configuration for ports {'2ab664a2-00a2-4e97-877c-3854355c736b', '92767216-7fcd-4a1f-a5c1-2c5d4ba6339b'} is completed#033[00m Feb 20 04:51:40 localhost nova_compute[281288]: 2026-02-20 09:51:40.489 281292 DEBUG nova.compute.manager [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:51:40 localhost nova_compute[281288]: 2026-02-20 09:51:40.490 281292 DEBUG oslo_concurrency.lockutils [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:40 localhost nova_compute[281288]: 2026-02-20 09:51:40.490 281292 DEBUG oslo_concurrency.lockutils [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:40 localhost nova_compute[281288]: 2026-02-20 09:51:40.491 281292 DEBUG oslo_concurrency.lockutils [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:40 localhost nova_compute[281288]: 2026-02-20 09:51:40.491 281292 DEBUG nova.compute.manager [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] No waiting events found dispatching network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 20 04:51:40 localhost nova_compute[281288]: 2026-02-20 09:51:40.492 281292 DEBUG nova.compute.manager [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 for instance with task_state migrating. 
_process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Feb 20 04:51:40 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.691 281292 INFO nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Took 4.21 seconds for pre_live_migration on destination host np0005625202.localdomain.#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.693 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.725 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] live_migration data is 
LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmp4xew7c85',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='90eb8d1f-8d13-4395-9d15-67fdaa60632d',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a8047bd1-acc9-47b2-a05d-5e7eb6222d12),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.731 281292 DEBUG nova.objects.instance [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lazy-loading 'migration_context' on Instance uuid 90eb8d1f-8d13-4395-9d15-67fdaa60632d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.734 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.736 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Operation thread is still 
running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.737 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.771 281292 DEBUG nova.virt.libvirt.vif [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-20T09:51:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-721665546',display_name='tempest-LiveMigrationTest-server-721665546',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005625204.localdomain',hostname='tempest-livemigrationtest-server-721665546',id=8,image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-20T09:51:32Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e704aae5b1ba49d59262f9aa0c366fb4',ramdisk_id='',reservation_id='r-erbwo03j',resources=None,root_device_n
ame='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2108133970',owner_user_name='tempest-LiveMigrationTest-2108133970-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2026-02-20T09:51:32Z,user_data=None,user_id='ba15d0e9919d4594a2e6e9d6b3414a5e',uuid=90eb8d1f-8d13-4395-9d15-67fdaa60632d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.772 281292 DEBUG nova.network.os_vif_util [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Converting VIF {"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.773 281292 DEBUG nova.network.os_vif_util [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.775 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updating guest XML with vif config: Feb 20 04:51:41 localhost nova_compute[281288]: Feb 20 04:51:41 localhost nova_compute[281288]: Feb 20 04:51:41 localhost nova_compute[281288]: Feb 20 04:51:41 localhost nova_compute[281288]: Feb 20 04:51:41 localhost nova_compute[281288]: Feb 20 04:51:41 localhost nova_compute[281288]: Feb 20 04:51:41 localhost nova_compute[281288]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m Feb 20 04:51:41 localhost nova_compute[281288]: 2026-02-20 09:51:41.777 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.240 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), 
(1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.240 281292 INFO nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.274 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.321 281292 INFO nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.742 281292 DEBUG nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.742 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.742 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.742 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.743 281292 DEBUG nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] No waiting events found dispatching network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.743 281292 WARNING nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 
90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received unexpected event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 for instance with vm_state active and task_state migrating.#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.743 281292 DEBUG nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-changed-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.743 281292 DEBUG nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Refreshing instance network info cache due to event network-changed-609a0699-8716-4bf8-9f50-bfeec5f65721. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.743 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.744 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquired lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.744 281292 DEBUG nova.network.neutron [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Refreshing network info cache for port 609a0699-8716-4bf8-9f50-bfeec5f65721 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.825 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Feb 
20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.825 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m Feb 20 04:51:42 localhost nova_compute[281288]: 2026-02-20 09:51:42.938 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.373 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.373 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Paused (Lifecycle Event)#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.376 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.377 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Downtime does 
not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.399 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.404 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.429 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] During sync_power_state the instance has a pending task (migrating). 
Skip.#033[00m Feb 20 04:51:43 localhost kernel: device tap609a0699-87 left promiscuous mode Feb 20 04:51:43 localhost NetworkManager[5988]: [1771581103.5514] device (tap609a0699-87): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Feb 20 04:51:43 localhost ovn_controller[156798]: 2026-02-20T09:51:43Z|00111|binding|INFO|Releasing lport 609a0699-8716-4bf8-9f50-bfeec5f65721 from this chassis (sb_readonly=0) Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.569 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:43 localhost ovn_controller[156798]: 2026-02-20T09:51:43Z|00112|binding|INFO|Setting lport 609a0699-8716-4bf8-9f50-bfeec5f65721 down in Southbound Feb 20 04:51:43 localhost ovn_controller[156798]: 2026-02-20T09:51:43Z|00113|binding|INFO|Releasing lport ce4822a0-5e7a-4c40-9856-6c8879a12ac7 from this chassis (sb_readonly=0) Feb 20 04:51:43 localhost ovn_controller[156798]: 2026-02-20T09:51:43Z|00114|binding|INFO|Setting lport ce4822a0-5e7a-4c40-9856-6c8879a12ac7 down in Southbound Feb 20 04:51:43 localhost ovn_controller[156798]: 2026-02-20T09:51:43Z|00115|binding|INFO|Removing iface tap609a0699-87 ovn-installed in OVS Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:43 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:43.595 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['fa:16:3e:c0:a3:f9 10.100.0.12'], port_security=['fa:16:3e:c0:a3:f9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain,np0005625202.localdomain', 'activation-strategy': 'rarp', 'additional-chassis-activated': '0a83b6be-9fe2-42ef-8768-88847d97b165'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-420346976', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '90eb8d1f-8d13-4395-9d15-67fdaa60632d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-420346976', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6a912071-fd9c-4d5f-8453-7f993db3506d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad9ac3f8-d9ff-4a1d-8092-e57f93de7b33, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=609a0699-8716-4bf8-9f50-bfeec5f65721) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:43 localhost ovn_controller[156798]: 2026-02-20T09:51:43Z|00116|binding|INFO|Releasing lport 2b93bbc2-5aeb-49cc-b610-6f4f7708d346 from this chassis (sb_readonly=0) Feb 20 04:51:43 localhost ovn_controller[156798]: 2026-02-20T09:51:43Z|00117|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:51:43 localhost ovn_controller[156798]: 2026-02-20T09:51:43Z|00118|binding|INFO|Releasing lport 8069ffae-e153-4a3e-ac83-1cd290da58a3 from this chassis (sb_readonly=0) Feb 
20 04:51:43 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:43.599 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:22:88 19.80.0.55'], port_security=['fa:16:3e:ef:22:88 19.80.0.55'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['609a0699-8716-4bf8-9f50-bfeec5f65721'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-288633192', 'neutron:cidrs': '19.80.0.55/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-288633192', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '6a912071-fd9c-4d5f-8453-7f993db3506d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=7655fb8f-4890-4990-9fdf-4d25849654f0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ce4822a0-5e7a-4c40-9856-6c8879a12ac7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:43 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:43.602 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 609a0699-8716-4bf8-9f50-bfeec5f65721 in datapath 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 unbound from our chassis#033[00m Feb 20 04:51:43 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:43.608 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8d59c69c-3a69-449e-9d36-233c1f4c5c30 IP addresses were not retrieved from the 
Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:51:43 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:43.608 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:51:43 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:43.609 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f2b38e-75fd-4e8a-b447-0b9b98d60c24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:43 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:43.610 162652 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 namespace which is not needed anymore#033[00m Feb 20 04:51:43 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Deactivated successfully. Feb 20 04:51:43 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Consumed 10.458s CPU time. Feb 20 04:51:43 localhost systemd-machined[85698]: Machine qemu-3-instance-00000008 terminated. 
Feb 20 04:51:43 localhost journal[206495]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk: No such file or directory Feb 20 04:51:43 localhost journal[206495]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk: No such file or directory Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.705 281292 DEBUG nova.network.neutron [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updated VIF entry in instance network info cache for port 609a0699-8716-4bf8-9f50-bfeec5f65721. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.706 281292 DEBUG nova.network.neutron [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updating instance_info_cache with network_info: [{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": 
"br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "np0005625202.localdomain"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:51:43 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:43.721 2 INFO neutron.agent.securitygroups_rpc [req-bc88a03e-b48b-4063-bf3f-e91bcc37d72d req-9e63afcc-d40a-4b2c-a3aa-f230d65e4db2 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['d4aeef42-5959-493a-9cfc-ec0d9adb0b00']#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.753 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.753 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.753 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration operation 
thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.762 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Releasing lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:51:43 localhost neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [NOTICE] (308590) : haproxy version is 2.8.14-c23fe91 Feb 20 04:51:43 localhost neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [NOTICE] (308590) : path to executable is /usr/sbin/haproxy Feb 20 04:51:43 localhost neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [WARNING] (308590) : Exiting Master process... Feb 20 04:51:43 localhost neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [ALERT] (308590) : Current worker (308592) exited with code 143 (Terminated) Feb 20 04:51:43 localhost neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [WARNING] (308590) : All workers exited. Exiting... (0) Feb 20 04:51:43 localhost systemd[1]: libpod-fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19.scope: Deactivated successfully. 
Feb 20 04:51:43 localhost podman[309059]: 2026-02-20 09:51:43.78972011 +0000 UTC m=+0.071729233 container died fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:51:43 localhost systemd[1]: tmp-crun.DZLg20.mount: Deactivated successfully. Feb 20 04:51:43 localhost podman[309059]: 2026-02-20 09:51:43.842580809 +0000 UTC m=+0.124589852 container cleanup fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Feb 20 04:51:43 localhost podman[309080]: 2026-02-20 09:51:43.860779532 +0000 UTC m=+0.063843578 container cleanup fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:51:43 localhost systemd[1]: libpod-conmon-fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19.scope: Deactivated successfully. Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.879 281292 DEBUG nova.virt.libvirt.guest [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '90eb8d1f-8d13-4395-9d15-67fdaa60632d' (instance-00000008) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.881 281292 INFO nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration operation has completed#033[00m Feb 20 04:51:43 localhost nova_compute[281288]: 2026-02-20 09:51:43.882 281292 INFO nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] _post_live_migration() is started..#033[00m Feb 20 04:51:44 localhost sshd[309107]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:51:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:44.646 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:43Z, description=, device_id=330257d1-c627-4905-9230-185815fc6ffb, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, 
extra_dhcp_opts=[], fixed_ips=[], id=ba2fda90-ec40-4c48-b4d3-77c44e1ba9a3, ip_allocation=immediate, mac_address=fa:16:3e:c0:25:a2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:34Z, description=, dns_domain=, id=71b28781-95be-4ab4-86ca-7c852dd117aa, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-478398302-network, port_security_enabled=True, project_id=3bf3a16481834e3a81e04ea40bee1d8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7731, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=623, status=ACTIVE, subnets=['0e07ae8b-e5fb-4d37-99ed-b66cb0c0c73e'], tags=[], tenant_id=3bf3a16481834e3a81e04ea40bee1d8d, updated_at=2026-02-20T09:51:36Z, vlan_transparent=None, network_id=71b28781-95be-4ab4-86ca-7c852dd117aa, port_security_enabled=False, project_id=3bf3a16481834e3a81e04ea40bee1d8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=661, status=DOWN, tags=[], tenant_id=3bf3a16481834e3a81e04ea40bee1d8d, updated_at=2026-02-20T09:51:44Z on network 71b28781-95be-4ab4-86ca-7c852dd117aa#033[00m Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.754 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:44 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:44.772 2 INFO neutron.agent.securitygroups_rpc [req-d0af9ee5-c34c-498a-a79b-d6b681e80e4a req-6395f2ff-4ffd-4bf4-8fd0-e88e3c18ce7a 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['f1d2b747-b5b9-4577-9543-577b07c94aaa']#033[00m Feb 20 04:51:44 localhost systemd[1]: 
var-lib-containers-storage-overlay-7397e54adaacdacf436fa6c7d8a45f9bdf2c03bd965044011b1c1cebe1f2aa8f-merged.mount: Deactivated successfully. Feb 20 04:51:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19-userdata-shm.mount: Deactivated successfully. Feb 20 04:51:44 localhost podman[309094]: 2026-02-20 09:51:44.790614666 +0000 UTC m=+0.929954439 container remove fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.798 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[082898fc-7673-464a-bed4-e62f1309e41f]: (4, ('Fri Feb 20 09:51:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 (fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19)\nfa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19\nFri Feb 20 09:51:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 (fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19)\nfa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.800 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac84158-22ba-41f9-b68d-3a3558805402]: (4, None) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.802 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51f8ae9c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.805 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:44 localhost kernel: device tap51f8ae9c-10 left promiscuous mode Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.820 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.825 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[27d52f11-5292-4b17-b2bb-65677c4405fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.846 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[95ccfdbf-8236-4b75-a0e8-be334f8f3dc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.847 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d91d2cea-ee8c-495f-9344-aac73411b212]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.861 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6f9e2c-1b07-4a34-9563-9ee51f1968b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], 
['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 
0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164973, 'reachable_time': 41135, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 
'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309141, 'error': None, 'target': 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:44 localhost systemd[1]: run-netns-ovnmeta\x2d51f8ae9c\x2d1ccc\x2d4ec5\x2d8a06\x2d5c7802ad29e0.mount: Deactivated successfully. Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.863 163070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.863 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb70769-67fd-49a5-ae10-f12419da1355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.864 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ce4822a0-5e7a-4c40-9856-6c8879a12ac7 in datapath 9021dc49-7e01-42e7-8f32-572dec89afcc unbound from our chassis#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.868 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port fc126e7a-67b5-4025-9da6-7c8301672033 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.868 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9021dc49-7e01-42e7-8f32-572dec89afcc, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.869 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[44cf62c5-e5f2-430c-9d1a-3ab98419de19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:44.870 162652 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc namespace which is not needed anymore#033[00m Feb 20 04:51:44 localhost dnsmasq[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/addn_hosts - 1 addresses Feb 20 04:51:44 localhost dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/host Feb 20 04:51:44 localhost dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/opts Feb 20 04:51:44 localhost podman[309128]: 2026-02-20 09:51:44.895366193 +0000 UTC m=+0.064979591 container kill 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.918 281292 DEBUG nova.compute.manager [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.919 281292 DEBUG oslo_concurrency.lockutils [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.920 281292 DEBUG oslo_concurrency.lockutils [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.921 281292 DEBUG oslo_concurrency.lockutils [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.921 281292 DEBUG nova.compute.manager [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] No waiting events found dispatching 
network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 20 04:51:44 localhost nova_compute[281288]: 2026-02-20 09:51:44.921 281292 DEBUG nova.compute.manager [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Feb 20 04:51:45 localhost neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [NOTICE] (308683) : haproxy version is 2.8.14-c23fe91 Feb 20 04:51:45 localhost neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [NOTICE] (308683) : path to executable is /usr/sbin/haproxy Feb 20 04:51:45 localhost neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [WARNING] (308683) : Exiting Master process... Feb 20 04:51:45 localhost neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [WARNING] (308683) : Exiting Master process... Feb 20 04:51:45 localhost neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [ALERT] (308683) : Current worker (308688) exited with code 143 (Terminated) Feb 20 04:51:45 localhost neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [WARNING] (308683) : All workers exited. Exiting... (0) Feb 20 04:51:45 localhost systemd[1]: libpod-941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c.scope: Deactivated successfully. 
Feb 20 04:51:45 localhost podman[309167]: 2026-02-20 09:51:45.064386011 +0000 UTC m=+0.073972761 container died 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.108 281292 DEBUG nova.network.neutron [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Activated binding for port 609a0699-8716-4bf8-9f50-bfeec5f65721 and host np0005625202.localdomain migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.109 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": 
"10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.111 281292 DEBUG nova.virt.libvirt.vif [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-20T09:51:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-721665546',display_name='tempest-LiveMigrationTest-server-721665546',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005625204.localdomain',hostname='tempest-livemigrationtest-server-721665546',id=8,image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-20T09:51:32Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progr
ess=0,project_id='e704aae5b1ba49d59262f9aa0c366fb4',ramdisk_id='',reservation_id='r-erbwo03j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2108133970',owner_user_name='tempest-LiveMigrationTest-2108133970-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2026-02-20T09:51:35Z,user_data=None,user_id='ba15d0e9919d4594a2e6e9d6b3414a5e',uuid=90eb8d1f-8d13-4395-9d15-67fdaa60632d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.112 281292 DEBUG nova.network.os_vif_util [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Converting VIF {"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.113 281292 DEBUG nova.network.os_vif_util [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.113 281292 DEBUG os_vif [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Feb 20 04:51:45 localhost podman[309167]: 2026-02-20 09:51:45.11459252 +0000 UTC m=+0.124179260 container cleanup 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.116 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.116 281292 DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap609a0699-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.118 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.120 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.124 281292 INFO os_vif [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87')#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.125 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.125 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.126 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.126 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.127 281292 INFO nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Deleting instance files /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d_del#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.127 281292 INFO nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Deletion of /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d_del complete#033[00m Feb 20 04:51:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:45.136 264355 INFO neutron.agent.dhcp.agent [None 
req-5f3762c0-1cb3-459e-abff-392fbcbf26c3 - - - - - -] DHCP configuration for ports {'ba2fda90-ec40-4c48-b4d3-77c44e1ba9a3'} is completed#033[00m Feb 20 04:51:45 localhost podman[309182]: 2026-02-20 09:51:45.14945183 +0000 UTC m=+0.077951798 container cleanup 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:51:45 localhost systemd[1]: libpod-conmon-941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c.scope: Deactivated successfully. Feb 20 04:51:45 localhost podman[309198]: 2026-02-20 09:51:45.219598775 +0000 UTC m=+0.080784773 container remove 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.224 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[060b3405-9785-4a34-9279-55da535def80]: (4, ('Fri Feb 20 09:51:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc 
(941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c)\n941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c\nFri Feb 20 09:51:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc (941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c)\n941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.226 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4db536ff-5e02-4037-bcd9-0703fd85878a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.227 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9021dc49-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.230 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:45 localhost kernel: device tap9021dc49-70 left promiscuous mode Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.233 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.238 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[aff20123-d610-416f-9de3-fb8208236ef2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:45 localhost nova_compute[281288]: 2026-02-20 09:51:45.243 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.260 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[979fe4d1-9021-4f27-ab7e-9e62d607f9a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.262 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[f10fc171-c42c-44e1-b7cd-03de60fceaf4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.279 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[07623aa5-31f9-4518-af3e-1ace6320e407]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1165072, 'reachable_time': 39996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 
'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309218, 'error': None, 'target': 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.281 163070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Feb 20 04:51:45 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:45.282 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[c112abee-704b-41f9-84f4-0b56d0b5019c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:45 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:45 localhost systemd[1]: var-lib-containers-storage-overlay-427c548c5ea3a77d1146e79412578647e7513ef27a630d63200866643b6640c1-merged.mount: Deactivated successfully. Feb 20 04:51:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c-userdata-shm.mount: Deactivated successfully. Feb 20 04:51:45 localhost systemd[1]: run-netns-ovnmeta\x2d9021dc49\x2d7e01\x2d42e7\x2d8f32\x2d572dec89afcc.mount: Deactivated successfully. Feb 20 04:51:46 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:46.580 2 INFO neutron.agent.securitygroups_rpc [req-d0a4da6f-de87-4709-b506-6d507f2fa68b req-a4293e8b-cdcb-4f0f-b9af-77766c0f126a 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['b0641abe-7ec2-4391-9e24-125339c7b7ee']#033[00m Feb 20 04:51:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:51:47 localhost podman[309219]: 2026-02-20 09:51:47.116783425 +0000 UTC m=+0.058536798 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:51:47 localhost podman[309219]: 2026-02-20 09:51:47.129065532 +0000 UTC m=+0.070818915 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:51:47 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:51:47 localhost nova_compute[281288]: 2026-02-20 09:51:47.310 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:47 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:47.317 2 INFO neutron.agent.securitygroups_rpc [req-bca898f3-80d6-4116-8407-48ccb221c91a req-faa58b52-ff7a-4794-95e2-54e55cdad610 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['57ce2b3f-bfcc-424f-be8f-efa4d8d83e67']#033[00m Feb 20 04:51:47 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:47.588 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:43Z, description=, device_id=330257d1-c627-4905-9230-185815fc6ffb, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ba2fda90-ec40-4c48-b4d3-77c44e1ba9a3, ip_allocation=immediate, mac_address=fa:16:3e:c0:25:a2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:34Z, description=, dns_domain=, id=71b28781-95be-4ab4-86ca-7c852dd117aa, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-478398302-network, port_security_enabled=True, project_id=3bf3a16481834e3a81e04ea40bee1d8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7731, qos_policy_id=None, 
revision_number=2, router:external=False, shared=False, standard_attr_id=623, status=ACTIVE, subnets=['0e07ae8b-e5fb-4d37-99ed-b66cb0c0c73e'], tags=[], tenant_id=3bf3a16481834e3a81e04ea40bee1d8d, updated_at=2026-02-20T09:51:36Z, vlan_transparent=None, network_id=71b28781-95be-4ab4-86ca-7c852dd117aa, port_security_enabled=False, project_id=3bf3a16481834e3a81e04ea40bee1d8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=661, status=DOWN, tags=[], tenant_id=3bf3a16481834e3a81e04ea40bee1d8d, updated_at=2026-02-20T09:51:44Z on network 71b28781-95be-4ab4-86ca-7c852dd117aa#033[00m Feb 20 04:51:47 localhost podman[241968]: time="2026-02-20T09:51:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:51:47 localhost podman[241968]: @ - - [20/Feb/2026:09:51:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162550 "" "Go-http-client/1.1" Feb 20 04:51:47 localhost podman[241968]: @ - - [20/Feb/2026:09:51:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20242 "" "Go-http-client/1.1" Feb 20 04:51:47 localhost dnsmasq[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/addn_hosts - 1 addresses Feb 20 04:51:47 localhost podman[309259]: 2026-02-20 09:51:47.84433328 +0000 UTC m=+0.079824094 container kill 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator 
team, tcib_managed=true) Feb 20 04:51:47 localhost dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/host Feb 20 04:51:47 localhost dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/opts Feb 20 04:51:47 localhost nova_compute[281288]: 2026-02-20 09:51:47.898 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:48 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:48.084 264355 INFO neutron.agent.dhcp.agent [None req-1ecfbbd2-7e17-47cc-a1b0-53717d9eb847 - - - - - -] DHCP configuration for ports {'ba2fda90-ec40-4c48-b4d3-77c44e1ba9a3'} is completed#033[00m Feb 20 04:51:48 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:48.224 2 INFO neutron.agent.securitygroups_rpc [req-9dee97d4-d9a8-4ee1-93af-ecec75edb6d8 req-b55f0bec-8773-478c-b205-7d4f6dd0e50e 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['35812fcc-3257-4b87-b011-2e502647727b']#033[00m Feb 20 04:51:48 localhost snmpd[68593]: empty variable list in _query Feb 20 04:51:48 localhost snmpd[68593]: empty variable list in _query Feb 20 04:51:48 localhost snmpd[68593]: empty variable list in _query Feb 20 04:51:48 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:48.489 2 INFO neutron.agent.securitygroups_rpc [req-3e7d9994-1b77-4e20-ab05-14f19dff3953 req-ef6c6054-1e67-499f-b1bf-b8ae592974a9 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['35812fcc-3257-4b87-b011-2e502647727b']#033[00m Feb 20 04:51:49 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:49.164 2 INFO neutron.agent.securitygroups_rpc [req-fdb7e361-ff4f-4f47-a1a0-e5e8ae6f1fbe req-da6cfbf4-9203-474f-ad26-64056962735b 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated 
['35812fcc-3257-4b87-b011-2e502647727b']#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.342 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.344 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.344 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.407 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 
09:51:49.407 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.408 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.408 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.409 281292 DEBUG oslo_concurrency.processutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:49 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e104 e104: 6 total, 6 up, 6 in Feb 20 04:51:49 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:51:49 
localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/552899203' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.857 281292 DEBUG oslo_concurrency.processutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.929 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:51:49 localhost nova_compute[281288]: 2026-02-20 09:51:49.930 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:51:49 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:49.948 2 INFO neutron.agent.securitygroups_rpc [None req-3fe948ea-c8e6-429c-836a-702342b0e4ac 2188e6de9cae445dadfba1541701ebd2 f299da1b635f4dafbe62328983ad1fae - - default default] Security group rule updated ['4439e19b-bf91-4420-aff1-6854f961fef4']#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.120 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:50 
localhost nova_compute[281288]: 2026-02-20 09:51:50.152 281292 WARNING nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.154 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11387MB free_disk=41.42898178100586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.155 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.156 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.200 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Migration for instance 90eb8d1f-8d13-4395-9d15-67fdaa60632d refers to another host's 
instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.222 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.255 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.256 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Migration a8047bd1-acc9-47b2-a05d-5e7eb6222d12 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.256 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.257 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:51:50 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:50.270 2 INFO neutron.agent.securitygroups_rpc [None req-e1dc84f3-2fa9-4dac-a092-2cb427ae3321 2188e6de9cae445dadfba1541701ebd2 f299da1b635f4dafbe62328983ad1fae - - default default] Security group rule updated ['4439e19b-bf91-4420-aff1-6854f961fef4']#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.323 281292 DEBUG oslo_concurrency.processutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:50 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:50 localhost 
ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:51:50 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3199515391' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.828 281292 DEBUG oslo_concurrency.processutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.834 281292 DEBUG nova.compute.provider_tree [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.861 281292 DEBUG nova.scheduler.client.report [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:51:50 localhost 
nova_compute[281288]: 2026-02-20 09:51:50.903 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.904 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:50 localhost nova_compute[281288]: 2026-02-20 09:51:50.924 281292 INFO nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migrating instance to np0005625202.localdomain finished successfully.#033[00m Feb 20 04:51:51 localhost nova_compute[281288]: 2026-02-20 09:51:51.049 281292 INFO nova.scheduler.client.report [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Deleted allocation for migration a8047bd1-acc9-47b2-a05d-5e7eb6222d12#033[00m Feb 20 04:51:51 localhost nova_compute[281288]: 2026-02-20 09:51:51.049 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m Feb 20 04:51:51 
localhost nova_compute[281288]: 2026-02-20 09:51:51.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:51 localhost nova_compute[281288]: 2026-02-20 09:51:51.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:51 localhost nova_compute[281288]: 2026-02-20 09:51:51.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:51 localhost nova_compute[281288]: 2026-02-20 09:51:51.747 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:51 localhost nova_compute[281288]: 2026-02-20 09:51:51.748 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:51:51 localhost nova_compute[281288]: 2026-02-20 09:51:51.748 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] 
Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e105 e105: 6 total, 6 up, 6 in Feb 20 04:51:52 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:51:52 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4266218097' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.198 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.274 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.275 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.337 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.498 281292 WARNING nova.virt.libvirt.driver [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.500 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11344MB free_disk=41.428184509277344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.501 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.501 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.563 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.563 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.564 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:51:52 localhost nova_compute[281288]: 2026-02-20 09:51:52.606 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:51:52 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e106 e106: 6 total, 6 up, 6 in Feb 20 04:51:53 localhost dnsmasq[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/addn_hosts - 0 addresses Feb 20 04:51:53 localhost dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/host Feb 20 04:51:53 localhost dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/opts Feb 20 04:51:53 localhost podman[309382]: 2026-02-20 09:51:53.025249982 +0000 UTC m=+0.059084987 container kill 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 20 04:51:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:51:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:51:53 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3716024828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:51:53 localhost nova_compute[281288]: 2026-02-20 09:51:53.116 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:51:53 localhost nova_compute[281288]: 2026-02-20 09:51:53.122 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:51:53 localhost podman[309398]: 2026-02-20 09:51:53.150631265 +0000 UTC m=+0.094213694 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:51:53 localhost podman[309398]: 2026-02-20 09:51:53.170165338 +0000 UTC m=+0.113723146 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', 
'--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:51:53 localhost nova_compute[281288]: 2026-02-20 09:51:53.182 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:51:53 localhost nova_compute[281288]: 2026-02-20 09:51:53.184 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:51:53 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:51:53 localhost nova_compute[281288]: 2026-02-20 09:51:53.185 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:51:53 localhost kernel: device tapf56de90b-39 left promiscuous mode Feb 20 04:51:53 localhost ovn_controller[156798]: 2026-02-20T09:51:53Z|00119|binding|INFO|Releasing lport f56de90b-39da-4f5e-beb2-23f63fa15081 from this chassis (sb_readonly=0) Feb 20 04:51:53 localhost ovn_controller[156798]: 2026-02-20T09:51:53Z|00120|binding|INFO|Setting lport f56de90b-39da-4f5e-beb2-23f63fa15081 down in Southbound Feb 20 04:51:53 localhost nova_compute[281288]: 2026-02-20 09:51:53.281 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:53.293 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-71b28781-95be-4ab4-86ca-7c852dd117aa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71b28781-95be-4ab4-86ca-7c852dd117aa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bf3a16481834e3a81e04ea40bee1d8d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26340ad6-3c33-4a82-9f2f-3413cbaaea9f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f56de90b-39da-4f5e-beb2-23f63fa15081) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:53.295 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f56de90b-39da-4f5e-beb2-23f63fa15081 in datapath 71b28781-95be-4ab4-86ca-7c852dd117aa unbound from our chassis#033[00m Feb 20 04:51:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:53.300 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 71b28781-95be-4ab4-86ca-7c852dd117aa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:51:53 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:53.301 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[f44705ee-999d-4348-a543-2bf332b51ec1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:53 localhost nova_compute[281288]: 2026-02-20 09:51:53.307 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:53 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:53.666 2 INFO neutron.agent.securitygroups_rpc [None req-2028af40-368e-4b25-90de-8401d53be72c 0db48d5f6f5e44fc93154cf4b34a94e0 a966116e4ddf4bdea0571a1bb751916e - - default default] Security group member updated ['07d2fe18-fbbf-4547-931e-bb55f378bade']#033[00m Feb 20 04:51:54 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:51:54 localhost podman[309430]: 2026-02-20 09:51:54.155855241 +0000 UTC m=+0.086885216 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1770267347, version=9.7, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 20 04:51:54 localhost podman[309430]: 2026-02-20 09:51:54.19801301 +0000 UTC m=+0.129042965 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 20 04:51:54 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:51:55 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:55.103 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:15Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=609a0699-8716-4bf8-9f50-bfeec5f65721, ip_allocation=immediate, mac_address=fa:16:3e:c0:a3:f9, name=tempest-parent-420346976, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=14, security_groups=['6a912071-fd9c-4d5f-8453-7f993db3506d'], standard_attr_id=521, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, trunk_details=sub_ports=[], trunk_id=bb723cd7-ac34-46b0-bf66-79c7ed1fe96f, updated_at=2026-02-20T09:51:52Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0#033[00m Feb 
20 04:51:55 localhost nova_compute[281288]: 2026-02-20 09:51:55.172 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:55 localhost nova_compute[281288]: 2026-02-20 09:51:55.180 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:55 localhost nova_compute[281288]: 2026-02-20 09:51:55.181 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:55 localhost nova_compute[281288]: 2026-02-20 09:51:55.203 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:55 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:55.204 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:55 localhost nova_compute[281288]: 2026-02-20 09:51:55.204 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:55 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:55.205 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:51:55 localhost nova_compute[281288]: 2026-02-20 09:51:55.206 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:55 localhost systemd[1]: tmp-crun.RFGBpS.mount: Deactivated successfully. Feb 20 04:51:55 localhost podman[309463]: 2026-02-20 09:51:55.347032629 +0000 UTC m=+0.061676252 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:51:55 localhost dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 2 addresses Feb 20 04:51:55 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host Feb 20 04:51:55 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts Feb 20 04:51:55 localhost dnsmasq[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/addn_hosts - 0 addresses Feb 20 04:51:55 localhost 
dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/host Feb 20 04:51:55 localhost dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/opts Feb 20 04:51:55 localhost podman[309497]: 2026-02-20 09:51:55.471819096 +0000 UTC m=+0.042592563 container kill aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:51:55 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:55.541 264355 INFO neutron.agent.dhcp.agent [None req-3217b94a-ac56-4875-a052-545d312cdafa - - - - - -] DHCP configuration for ports {'609a0699-8716-4bf8-9f50-bfeec5f65721'} is completed#033[00m Feb 20 04:51:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:51:55 localhost nova_compute[281288]: 2026-02-20 09:51:55.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:55 localhost nova_compute[281288]: 2026-02-20 09:51:55.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:56 localhost 
ovn_controller[156798]: 2026-02-20T09:51:56Z|00121|binding|INFO|Releasing lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 from this chassis (sb_readonly=0) Feb 20 04:51:56 localhost kernel: device tap21b010cc-c3 left promiscuous mode Feb 20 04:51:56 localhost ovn_controller[156798]: 2026-02-20T09:51:56Z|00122|binding|INFO|Setting lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 down in Southbound Feb 20 04:51:56 localhost nova_compute[281288]: 2026-02-20 09:51:56.292 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:56.304 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-09ccac50-3316-4f5e-b2ff-0e97a71903d8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09ccac50-3316-4f5e-b2ff-0e97a71903d8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e7d3e1cfe9f4e4d8451c6f0b8be3a29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04bb3800-97e8-42cd-83bb-692b59d74b62, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=21b010cc-c3ff-4013-97b7-6b7eb23e47a9) old=Port_Binding(up=[True], 
chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:56.306 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 in datapath 09ccac50-3316-4f5e-b2ff-0e97a71903d8 unbound from our chassis#033[00m Feb 20 04:51:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:56.311 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09ccac50-3316-4f5e-b2ff-0e97a71903d8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:51:56 localhost nova_compute[281288]: 2026-02-20 09:51:56.311 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:56.312 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bbe344-3a26-43c6-861a-d7328fd06174]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:56 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:56.488 2 INFO neutron.agent.securitygroups_rpc [req-9bfa3730-17a6-4d3e-b4cb-835cdd225f2c req-73160dbe-971e-4219-ac30-c0c28777ca1e 2188e6de9cae445dadfba1541701ebd2 f299da1b635f4dafbe62328983ad1fae - - default default] Security group member updated ['4439e19b-bf91-4420-aff1-6854f961fef4']#033[00m Feb 20 04:51:56 localhost openstack_network_exporter[244414]: ERROR 09:51:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:51:56 localhost openstack_network_exporter[244414]: Feb 20 04:51:56 localhost openstack_network_exporter[244414]: ERROR 09:51:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:51:56 localhost openstack_network_exporter[244414]: Feb 20 04:51:56 
localhost nova_compute[281288]: 2026-02-20 09:51:56.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:56 localhost nova_compute[281288]: 2026-02-20 09:51:56.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:51:56 localhost nova_compute[281288]: 2026-02-20 09:51:56.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.096 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.097 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.097 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.098 281292 DEBUG 
nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.368 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:57 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:57.444 2 INFO neutron.agent.securitygroups_rpc [None req-426d7c59-43bb-4b5f-98f0-2945e94d9430 ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Security group member updated ['6a912071-fd9c-4d5f-8453-7f993db3506d']#033[00m Feb 20 04:51:57 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e107 e107: 6 total, 6 up, 6 in Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.912 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": 
"ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:51:57 localhost podman[309539]: 2026-02-20 09:51:57.915746371 +0000 UTC m=+0.061414265 container kill d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:51:57 localhost dnsmasq[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/addn_hosts - 0 addresses Feb 20 04:51:57 localhost dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/host Feb 20 04:51:57 localhost dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/opts Feb 20 04:51:57 localhost systemd[1]: tmp-crun.eTQV1A.mount: Deactivated successfully. 
Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.961 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.962 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:51:57 localhost nova_compute[281288]: 2026-02-20 09:51:57.963 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:51:58 localhost ovn_controller[156798]: 2026-02-20T09:51:58Z|00123|binding|INFO|Removing iface tap22bf7523-8a ovn-installed in OVS Feb 20 04:51:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:58.335 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fc126e7a-67b5-4025-9da6-7c8301672033 with type ""#033[00m Feb 20 04:51:58 localhost ovn_controller[156798]: 2026-02-20T09:51:58Z|00124|binding|INFO|Removing lport 22bf7523-8a19-46b0-a0b7-53070ea1823e ovn-installed in OVS Feb 20 04:51:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:58.337 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], 
ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7655fb8f-4890-4990-9fdf-4d25849654f0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=22bf7523-8a19-46b0-a0b7-53070ea1823e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:51:58 localhost nova_compute[281288]: 2026-02-20 09:51:58.337 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:58.339 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 22bf7523-8a19-46b0-a0b7-53070ea1823e in datapath 9021dc49-7e01-42e7-8f32-572dec89afcc unbound from our chassis#033[00m Feb 20 04:51:58 localhost nova_compute[281288]: 2026-02-20 09:51:58.344 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:58.343 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9021dc49-7e01-42e7-8f32-572dec89afcc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m 
Feb 20 04:51:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:51:58.345 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c5273751-0973-4044-bc7e-cffbdd344cd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:51:58 localhost podman[309578]: 2026-02-20 09:51:58.376430278 +0000 UTC m=+0.064273020 container kill d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:51:58 localhost dnsmasq[307844]: exiting on receipt of SIGTERM Feb 20 04:51:58 localhost systemd[1]: libpod-d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635.scope: Deactivated successfully. 
Feb 20 04:51:58 localhost podman[309592]: 2026-02-20 09:51:58.453736656 +0000 UTC m=+0.061633842 container died d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:51:58 localhost podman[309592]: 2026-02-20 09:51:58.483482814 +0000 UTC m=+0.091379950 container cleanup d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:51:58 localhost systemd[1]: libpod-conmon-d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635.scope: Deactivated successfully. 
Feb 20 04:51:58 localhost podman[309599]: 2026-02-20 09:51:58.524698345 +0000 UTC m=+0.120383605 container remove d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 04:51:58 localhost kernel: device tap22bf7523-8a left promiscuous mode Feb 20 04:51:58 localhost nova_compute[281288]: 2026-02-20 09:51:58.579 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:58 localhost nova_compute[281288]: 2026-02-20 09:51:58.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:58.616 264355 INFO neutron.agent.dhcp.agent [None req-a7409a55-f09d-478f-9f60-9dd13371890d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:51:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:51:58.617 264355 INFO neutron.agent.dhcp.agent [None req-a7409a55-f09d-478f-9f60-9dd13371890d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:51:58 localhost ovn_controller[156798]: 2026-02-20T09:51:58Z|00125|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:51:58 localhost nova_compute[281288]: 2026-02-20 09:51:58.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:51:58 localhost nova_compute[281288]: 2026-02-20 09:51:58.733 281292 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 20 04:51:58 localhost nova_compute[281288]: 2026-02-20 09:51:58.734 281292 INFO nova.compute.manager [-] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Stopped (Lifecycle Event)#033[00m Feb 20 04:51:58 localhost nova_compute[281288]: 2026-02-20 09:51:58.751 281292 DEBUG nova.compute.manager [None req-6ef98482-7fb0-450d-9035-039a093d5d7b - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 20 04:51:58 localhost systemd[1]: var-lib-containers-storage-overlay-faa9e2373e9edef47d940be28280fa18e93a1d836a7561ac2b42ed8a739e240e-merged.mount: Deactivated successfully. Feb 20 04:51:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635-userdata-shm.mount: Deactivated successfully. Feb 20 04:51:58 localhost systemd[1]: run-netns-qdhcp\x2d9021dc49\x2d7e01\x2d42e7\x2d8f32\x2d572dec89afcc.mount: Deactivated successfully. 
Feb 20 04:51:59 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:59.074 2 INFO neutron.agent.securitygroups_rpc [None req-bcd9a2b7-ab94-49ae-b942-9c3b757c3657 0db48d5f6f5e44fc93154cf4b34a94e0 a966116e4ddf4bdea0571a1bb751916e - - default default] Security group member updated ['07d2fe18-fbbf-4547-931e-bb55f378bade']#033[00m Feb 20 04:51:59 localhost neutron_sriov_agent[257177]: 2026-02-20 09:51:59.517 2 INFO neutron.agent.securitygroups_rpc [None req-da379379-3275-471e-8ade-92d9716364d1 ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Security group member updated ['6a912071-fd9c-4d5f-8453-7f993db3506d']#033[00m Feb 20 04:51:59 localhost podman[309641]: 2026-02-20 09:51:59.795110469 +0000 UTC m=+0.057762255 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 20 04:51:59 localhost dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 1 addresses Feb 20 04:51:59 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host Feb 20 04:51:59 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts Feb 20 04:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:51:59 localhost podman[309653]: 2026-02-20 09:51:59.918943456 +0000 UTC m=+0.097311876 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 04:52:00 localhost nova_compute[281288]: 2026-02-20 09:52:00.415 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:00 localhost podman[309653]: 2026-02-20 09:52:00.459080645 +0000 UTC m=+0.637449125 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:52:00 localhost systemd[1]: tmp-crun.plNHWW.mount: Deactivated successfully. Feb 20 04:52:00 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:52:00 localhost podman[309654]: 2026-02-20 09:52:00.487771072 +0000 UTC m=+0.659926096 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:52:00 localhost 
podman[309654]: 2026-02-20 09:52:00.491151263 +0000 UTC m=+0.663306277 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Feb 20 04:52:00 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:52:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:00 localhost nova_compute[281288]: 2026-02-20 09:52:00.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:00 localhost nova_compute[281288]: 2026-02-20 09:52:00.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:52:02 localhost nova_compute[281288]: 2026-02-20 09:52:02.414 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:02 localhost ovn_controller[156798]: 2026-02-20T09:52:02Z|00126|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:02 localhost nova_compute[281288]: 2026-02-20 09:52:02.554 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:02 localhost podman[309721]: 2026-02-20 09:52:02.733955662 +0000 UTC m=+0.060992273 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:52:02 localhost dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 0 addresses Feb 20 04:52:02 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host Feb 20 04:52:02 localhost dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts Feb 20 04:52:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:03.208 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:52:03 localhost ovn_controller[156798]: 2026-02-20T09:52:03Z|00127|binding|INFO|Releasing lport 9906e141-c388-453f-9169-7c98a351db5e from this chassis (sb_readonly=0) Feb 20 04:52:03 localhost kernel: device tap9906e141-c3 left promiscuous mode Feb 20 04:52:03 localhost nova_compute[281288]: 2026-02-20 09:52:03.278 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:03 localhost ovn_controller[156798]: 2026-02-20T09:52:03Z|00128|binding|INFO|Setting lport 9906e141-c388-453f-9169-7c98a351db5e down in Southbound Feb 20 04:52:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:03.289 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad9ac3f8-d9ff-4a1d-8092-e57f93de7b33, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9906e141-c388-453f-9169-7c98a351db5e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:52:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:03.291 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 9906e141-c388-453f-9169-7c98a351db5e in datapath 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 unbound from our chassis#033[00m Feb 20 04:52:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:03.295 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:52:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:03.296 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d759f397-0a33-40ad-b0c1-57a983e190b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:52:03 
localhost nova_compute[281288]: 2026-02-20 09:52:03.309 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:03 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e108 e108: 6 total, 6 up, 6 in Feb 20 04:52:03 localhost dnsmasq[309024]: exiting on receipt of SIGTERM Feb 20 04:52:03 localhost podman[309761]: 2026-02-20 09:52:03.718844351 +0000 UTC m=+0.045560071 container kill 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:52:03 localhost systemd[1]: libpod-427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f.scope: Deactivated successfully. Feb 20 04:52:03 localhost podman[309776]: 2026-02-20 09:52:03.822254389 +0000 UTC m=+0.086568796 container died 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127) Feb 20 04:52:03 localhost systemd[1]: tmp-crun.TqJWhT.mount: Deactivated successfully. 
Feb 20 04:52:03 localhost podman[309776]: 2026-02-20 09:52:03.846366579 +0000 UTC m=+0.110680976 container cleanup 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:52:03 localhost systemd[1]: libpod-conmon-427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f.scope: Deactivated successfully. Feb 20 04:52:03 localhost podman[309775]: 2026-02-20 09:52:03.88357189 +0000 UTC m=+0.141552198 container remove 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:52:04 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:04.142 264355 INFO neutron.agent.dhcp.agent [None req-199b7a8b-74a0-461c-94c6-8bf3935123a5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:04 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:04.623 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:04 localhost ovn_controller[156798]: 
2026-02-20T09:52:04Z|00129|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:04 localhost systemd[1]: var-lib-containers-storage-overlay-251cfd009f8c3881b0902462fc67fb4f6f911df6e30e6a2f4b7d252da92d4521-merged.mount: Deactivated successfully. Feb 20 04:52:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f-userdata-shm.mount: Deactivated successfully. Feb 20 04:52:04 localhost systemd[1]: run-netns-qdhcp\x2d71b28781\x2d95be\x2d4ab4\x2d86ca\x2d7c852dd117aa.mount: Deactivated successfully. Feb 20 04:52:04 localhost nova_compute[281288]: 2026-02-20 09:52:04.890 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:05 localhost nova_compute[281288]: 2026-02-20 09:52:05.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:06.015 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:52:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:06.016 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:52:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:06.016 162652 
DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 04:52:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:07.414 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 20 04:52:07 localhost nova_compute[281288]: 2026-02-20 09:52:07.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:52:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 e109: 6 total, 6 up, 6 in
Feb 20 04:52:07 localhost dnsmasq[308755]: exiting on receipt of SIGTERM
Feb 20 04:52:07 localhost podman[309822]: 2026-02-20 09:52:07.998740857 +0000 UTC m=+0.062542969 container kill aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 04:52:07 localhost systemd[1]: libpod-aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19.scope: Deactivated successfully.
Feb 20 04:52:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:52:08 localhost podman[309836]: 2026-02-20 09:52:08.058327276 +0000 UTC m=+0.048240771 container died aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:52:08 localhost systemd[1]: tmp-crun.bKKvT6.mount: Deactivated successfully. Feb 20 04:52:08 localhost podman[309836]: 2026-02-20 09:52:08.09631628 +0000 UTC m=+0.086229735 container cleanup aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:52:08 localhost systemd[1]: libpod-conmon-aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19.scope: Deactivated successfully. 
Feb 20 04:52:08 localhost podman[309843]: 2026-02-20 09:52:08.104239377 +0000 UTC m=+0.080032590 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team) Feb 20 04:52:08 localhost podman[309843]: 2026-02-20 09:52:08.118972858 +0000 UTC m=+0.094766061 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
managed_by=edpm_ansible, tcib_managed=true) Feb 20 04:52:08 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:52:08 localhost podman[309844]: 2026-02-20 09:52:08.17062936 +0000 UTC m=+0.140270120 container remove aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:52:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:08.601 264355 INFO neutron.agent.dhcp.agent [None req-c27b73df-3cdd-4605-8878-1cda769fd03b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:08.721 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:08.937 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:08 localhost systemd[1]: var-lib-containers-storage-overlay-422ec30970010cad66130b89158fe73344869aca88877d7c2bd64592bab9a6ec-merged.mount: Deactivated successfully. Feb 20 04:52:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19-userdata-shm.mount: Deactivated successfully. Feb 20 04:52:08 localhost systemd[1]: run-netns-qdhcp\x2d09ccac50\x2d3316\x2d4f5e\x2db2ff\x2d0e97a71903d8.mount: Deactivated successfully. 
Feb 20 04:52:10 localhost nova_compute[281288]: 2026-02-20 09:52:10.471 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:11 localhost ovn_controller[156798]: 2026-02-20T09:52:11Z|00130|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:11 localhost nova_compute[281288]: 2026-02-20 09:52:11.788 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:11 localhost sshd[309877]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:52:12 localhost ovn_controller[156798]: 2026-02-20T09:52:11Z|00131|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:12 localhost nova_compute[281288]: 2026-02-20 09:52:12.040 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:12 localhost nova_compute[281288]: 2026-02-20 09:52:12.460 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:12 localhost podman[309896]: 2026-02-20 09:52:12.719052755 +0000 UTC m=+0.040682566 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true) Feb 20 04:52:12 localhost dnsmasq[307248]: exiting on receipt of SIGTERM Feb 20 04:52:12 localhost systemd[1]: libpod-a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7.scope: Deactivated successfully. Feb 20 04:52:12 localhost podman[309910]: 2026-02-20 09:52:12.784625413 +0000 UTC m=+0.051374765 container died a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:52:12 localhost systemd[1]: tmp-crun.qomzvQ.mount: Deactivated successfully. Feb 20 04:52:12 localhost podman[309910]: 2026-02-20 09:52:12.822756792 +0000 UTC m=+0.089506104 container cleanup a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:52:12 localhost systemd[1]: libpod-conmon-a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7.scope: Deactivated successfully. 
Feb 20 04:52:12 localhost podman[309911]: 2026-02-20 09:52:12.876113825 +0000 UTC m=+0.137705843 container remove a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:52:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:12.900 264355 INFO neutron.agent.dhcp.agent [None req-c0560a25-3a64-44e8-b1f8-05a9e7a24113 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 20 04:52:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:13.212 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 20 04:52:13 localhost systemd[1]: var-lib-containers-storage-overlay-4b7f60ded7c720c0eef64273aac4a3860430b8d407f2c1fb2fc8c0fb6a9fb560-merged.mount: Deactivated successfully.
Feb 20 04:52:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7-userdata-shm.mount: Deactivated successfully.
Feb 20 04:52:13 localhost systemd[1]: run-netns-qdhcp\x2d51f8ae9c\x2d1ccc\x2d4ec5\x2d8a06\x2d5c7802ad29e0.mount: Deactivated successfully.
Feb 20 04:52:15 localhost nova_compute[281288]: 2026-02-20 09:52:15.474 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:52:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:52:15 localhost neutron_sriov_agent[257177]: 2026-02-20 09:52:15.939 2 INFO neutron.agent.securitygroups_rpc [None req-343265ee-aef8-4c0b-8b69-d5c79e80995b ad3bee90b7c843958ab29e9ae5697cd5 78fdd34f107b4ec7ac81795ecc3f677c - - default default] Security group member updated ['7f2f6730-5897-423d-9b80-6a0cf94c3a8f']#033[00m
Feb 20 04:52:17 localhost ovn_controller[156798]: 2026-02-20T09:52:17Z|00132|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 04:52:17 localhost nova_compute[281288]: 2026-02-20 09:52:17.139 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:52:17 localhost nova_compute[281288]: 2026-02-20 09:52:17.463 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:52:17 localhost podman[241968]: time="2026-02-20T09:52:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 04:52:17 localhost podman[241968]: @ - - [20/Feb/2026:09:52:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 04:52:17 localhost podman[241968]: @ - - [20/Feb/2026:09:52:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18329 "" "Go-http-client/1.1"
Feb 20 04:52:17 localhost nova_compute[281288]: 2026-02-20 09:52:17.828 281292 DEBUG
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:17 localhost ovn_controller[156798]: 2026-02-20T09:52:17Z|00133|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:17 localhost nova_compute[281288]: 2026-02-20 09:52:17.948 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:52:18 localhost systemd[1]: tmp-crun.0Smlan.mount: Deactivated successfully. Feb 20 04:52:18 localhost podman[309938]: 2026-02-20 09:52:18.156137666 +0000 UTC m=+0.086087381 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:52:18 localhost podman[309938]: 2026-02-20 09:52:18.173703881 +0000 UTC m=+0.103653576 container exec_died 
010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:52:18 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:52:18 localhost sshd[309961]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:52:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:52:19.519 2 INFO neutron.agent.securitygroups_rpc [None req-e2d6b938-f6f5-4317-a8d2-0776bdf5afe2 ad3bee90b7c843958ab29e9ae5697cd5 78fdd34f107b4ec7ac81795ecc3f677c - - default default] Security group member updated ['7f2f6730-5897-423d-9b80-6a0cf94c3a8f']#033[00m
Feb 20 04:52:20 localhost nova_compute[281288]: 2026-02-20 09:52:20.477 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:52:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:52:22 localhost nova_compute[281288]: 2026-02-20 09:52:22.502 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:52:23 localhost nova_compute[281288]: 2026-02-20 09:52:23.031 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:52:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:52:24 localhost systemd[1]: tmp-crun.e8MLG2.mount: Deactivated successfully.
Feb 20 04:52:24 localhost podman[309963]: 2026-02-20 09:52:24.165135314 +0000 UTC m=+0.095715529 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:52:24 localhost podman[309963]: 2026-02-20 09:52:24.172439492 +0000 UTC m=+0.103019697 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:52:24 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:52:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:52:25 localhost podman[309986]: 2026-02-20 09:52:25.134941583 +0000 UTC m=+0.071029883 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, version=9.7, vcs-type=git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z) Feb 20 04:52:25 localhost podman[309986]: 2026-02-20 09:52:25.152068524 +0000 UTC m=+0.088156814 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container) Feb 20 04:52:25 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:52:25 localhost nova_compute[281288]: 2026-02-20 09:52:25.523 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:25 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:26 localhost openstack_network_exporter[244414]: ERROR 09:52:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:52:26 localhost openstack_network_exporter[244414]: Feb 20 04:52:26 localhost openstack_network_exporter[244414]: ERROR 09:52:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:52:26 localhost openstack_network_exporter[244414]: Feb 20 04:52:27 localhost nova_compute[281288]: 2026-02-20 09:52:27.506 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:27 localhost ovn_controller[156798]: 2026-02-20T09:52:27Z|00134|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:28 localhost nova_compute[281288]: 2026-02-20 09:52:28.000 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:28 localhost neutron_sriov_agent[257177]: 2026-02-20 09:52:28.824 2 INFO neutron.agent.securitygroups_rpc 
[req-b5848cd8-466b-4ef0-b46b-f454b04eeec2 req-730cd1ba-0675-45ee-8c23-360f67ec8632 2188e6de9cae445dadfba1541701ebd2 f299da1b635f4dafbe62328983ad1fae - - default default] Security group member updated ['4439e19b-bf91-4420-aff1-6854f961fef4']#033[00m Feb 20 04:52:29 localhost sshd[310006]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:52:30 localhost nova_compute[281288]: 2026-02-20 09:52:30.529 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:52:30 localhost podman[310008]: 2026-02-20 09:52:30.881139094 +0000 UTC m=+0.093954367 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 04:52:30 localhost podman[310009]: 2026-02-20 09:52:30.949857875 +0000 UTC m=+0.160174073 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, tcib_managed=true) Feb 20 04:52:30 localhost podman[310008]: 2026-02-20 09:52:30.979206162 +0000 UTC m=+0.192021465 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller) Feb 20 04:52:30 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:52:31 localhost podman[310009]: 2026-02-20 09:52:31.035200984 +0000 UTC m=+0.245517192 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:52:31 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:52:32 localhost nova_compute[281288]: 2026-02-20 09:52:32.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:32 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:32.577 264355 INFO neutron.agent.linux.ip_lib [None req-8dac0a0d-a09c-464f-a6ac-4c368d2bba8b - - - - - -] Device tap7fc1435c-10 cannot be used as it has no MAC address#033[00m Feb 20 04:52:32 localhost nova_compute[281288]: 2026-02-20 09:52:32.603 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:32 localhost kernel: device tap7fc1435c-10 entered promiscuous mode Feb 20 04:52:32 localhost NetworkManager[5988]: [1771581152.6128] manager: (tap7fc1435c-10): new Generic device (/org/freedesktop/NetworkManager/Devices/26) Feb 20 04:52:32 localhost ovn_controller[156798]: 2026-02-20T09:52:32Z|00135|binding|INFO|Claiming lport 7fc1435c-10da-4551-8776-a30225c1584b for this chassis. Feb 20 04:52:32 localhost nova_compute[281288]: 2026-02-20 09:52:32.614 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:32 localhost systemd-udevd[310061]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:52:32 localhost ovn_controller[156798]: 2026-02-20T09:52:32Z|00136|binding|INFO|7fc1435c-10da-4551-8776-a30225c1584b: Claiming unknown Feb 20 04:52:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:32.631 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-d260fabb-b595-4411-92db-b47a732060f6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d260fabb-b595-4411-92db-b47a732060f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca1bfffabed04c6d8fc33cdd0ddf56a4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46ce4a9-c894-4089-9b26-e586ca861a84, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7fc1435c-10da-4551-8776-a30225c1584b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:52:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:32.632 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 7fc1435c-10da-4551-8776-a30225c1584b in datapath d260fabb-b595-4411-92db-b47a732060f6 bound to our chassis#033[00m Feb 20 04:52:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:32.634 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
d260fabb-b595-4411-92db-b47a732060f6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:52:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:32.635 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[668a0102-af41-4b42-ae5d-5732297e91dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:52:32 localhost journal[229984]: ethtool ioctl error on tap7fc1435c-10: No such device Feb 20 04:52:32 localhost journal[229984]: ethtool ioctl error on tap7fc1435c-10: No such device Feb 20 04:52:32 localhost ovn_controller[156798]: 2026-02-20T09:52:32Z|00137|binding|INFO|Setting lport 7fc1435c-10da-4551-8776-a30225c1584b ovn-installed in OVS Feb 20 04:52:32 localhost ovn_controller[156798]: 2026-02-20T09:52:32Z|00138|binding|INFO|Setting lport 7fc1435c-10da-4551-8776-a30225c1584b up in Southbound Feb 20 04:52:32 localhost journal[229984]: ethtool ioctl error on tap7fc1435c-10: No such device Feb 20 04:52:32 localhost nova_compute[281288]: 2026-02-20 09:52:32.656 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:32 localhost journal[229984]: ethtool ioctl error on tap7fc1435c-10: No such device Feb 20 04:52:32 localhost journal[229984]: ethtool ioctl error on tap7fc1435c-10: No such device Feb 20 04:52:32 localhost journal[229984]: ethtool ioctl error on tap7fc1435c-10: No such device Feb 20 04:52:32 localhost journal[229984]: ethtool ioctl error on tap7fc1435c-10: No such device Feb 20 04:52:32 localhost journal[229984]: ethtool ioctl error on tap7fc1435c-10: No such device Feb 20 04:52:32 localhost nova_compute[281288]: 2026-02-20 09:52:32.696 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Feb 20 04:52:32 localhost nova_compute[281288]: 2026-02-20 09:52:32.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:33 localhost podman[310134]: Feb 20 04:52:33 localhost podman[310134]: 2026-02-20 09:52:33.604581165 +0000 UTC m=+0.094194164 container create ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Feb 20 04:52:33 localhost systemd[1]: Started libpod-conmon-ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331.scope. Feb 20 04:52:33 localhost podman[310134]: 2026-02-20 09:52:33.559788848 +0000 UTC m=+0.049401887 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:52:33 localhost systemd[1]: Started libcrun container. 
Feb 20 04:52:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92bd17572daf82e995501baa24d56fe0cc42bf636ed181dc5240c8cecc9c8bc8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:52:33 localhost podman[310134]: 2026-02-20 09:52:33.679368609 +0000 UTC m=+0.168981608 container init ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 20 04:52:33 localhost podman[310134]: 2026-02-20 09:52:33.692780668 +0000 UTC m=+0.182393637 container start ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:52:33 localhost dnsmasq[310152]: started, version 2.85 cachesize 150 Feb 20 04:52:33 localhost dnsmasq[310152]: DNS service limited to local subnets Feb 20 04:52:33 localhost dnsmasq[310152]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:52:33 localhost dnsmasq[310152]: warning: no upstream servers 
configured Feb 20 04:52:33 localhost dnsmasq-dhcp[310152]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:52:33 localhost dnsmasq[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/addn_hosts - 0 addresses Feb 20 04:52:33 localhost dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/host Feb 20 04:52:33 localhost dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/opts Feb 20 04:52:33 localhost ovn_controller[156798]: 2026-02-20T09:52:33Z|00139|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:33 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:33.905 264355 INFO neutron.agent.dhcp.agent [None req-dc7fdb6d-8eae-4ab0-83a1-3cd0980d6458 - - - - - -] DHCP configuration for ports {'abe0ba2a-c493-425d-8827-68be9f0f0a81'} is completed#033[00m Feb 20 04:52:33 localhost nova_compute[281288]: 2026-02-20 09:52:33.914 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:35 localhost nova_compute[281288]: 2026-02-20 09:52:35.564 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:35 localhost nova_compute[281288]: 2026-02-20 09:52:35.673 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:35 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:52:35 localhost ceph-mon[301857]: from='mgr.44375 
172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:52:36 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:36.617 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:52:36Z, description=, device_id=3095f6e8-d4c1-4b47-b904-07c6a9deaaf2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5e004d3c-491a-4ae0-ad9a-f29043bf90a7, ip_allocation=immediate, mac_address=fa:16:3e:2b:6a:50, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:52:30Z, description=, dns_domain=, id=d260fabb-b595-4411-92db-b47a732060f6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-1855742561-network, port_security_enabled=True, project_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41604, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=834, status=ACTIVE, subnets=['663a7f2b-e539-4525-8f2f-a5461c1df7da'], tags=[], tenant_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, updated_at=2026-02-20T09:52:31Z, vlan_transparent=None, network_id=d260fabb-b595-4411-92db-b47a732060f6, port_security_enabled=False, project_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=864, status=DOWN, tags=[], tenant_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, updated_at=2026-02-20T09:52:36Z on network d260fabb-b595-4411-92db-b47a732060f6#033[00m Feb 20 04:52:36 localhost neutron_sriov_agent[257177]: 2026-02-20 09:52:36.824 2 INFO neutron.agent.securitygroups_rpc [None 
req-d63b0875-b3f1-4849-b165-16313644e666 eab28fccca6a48139a7d8b395d8f0b9a dc182b0a7cbb4e47b6b88befc2c48022 - - default default] Security group member updated ['8d0cb685-1e0f-43aa-973a-a081d9962496']#033[00m Feb 20 04:52:36 localhost dnsmasq[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/addn_hosts - 1 addresses Feb 20 04:52:36 localhost dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/host Feb 20 04:52:36 localhost dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/opts Feb 20 04:52:36 localhost podman[310255]: 2026-02-20 09:52:36.826746209 +0000 UTC m=+0.047926802 container kill ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:52:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:37.081 264355 INFO neutron.agent.dhcp.agent [None req-3b46eb09-90dd-425c-a00e-51b5f307d183 - - - - - -] DHCP configuration for ports {'5e004d3c-491a-4ae0-ad9a-f29043bf90a7'} is completed#033[00m Feb 20 04:52:37 localhost neutron_sriov_agent[257177]: 2026-02-20 09:52:37.385 2 INFO neutron.agent.securitygroups_rpc [None req-51ac2042-ec94-4975-95ca-42a72712c92b eab28fccca6a48139a7d8b395d8f0b9a dc182b0a7cbb4e47b6b88befc2c48022 - - default default] Security group member updated ['8d0cb685-1e0f-43aa-973a-a081d9962496']#033[00m Feb 20 04:52:37 localhost nova_compute[281288]: 2026-02-20 09:52:37.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:37.704 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:52:36Z, description=, device_id=3095f6e8-d4c1-4b47-b904-07c6a9deaaf2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5e004d3c-491a-4ae0-ad9a-f29043bf90a7, ip_allocation=immediate, mac_address=fa:16:3e:2b:6a:50, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:52:30Z, description=, dns_domain=, id=d260fabb-b595-4411-92db-b47a732060f6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-1855742561-network, port_security_enabled=True, project_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41604, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=834, status=ACTIVE, subnets=['663a7f2b-e539-4525-8f2f-a5461c1df7da'], tags=[], tenant_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, updated_at=2026-02-20T09:52:31Z, vlan_transparent=None, network_id=d260fabb-b595-4411-92db-b47a732060f6, port_security_enabled=False, project_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=864, status=DOWN, tags=[], tenant_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, updated_at=2026-02-20T09:52:36Z on network d260fabb-b595-4411-92db-b47a732060f6#033[00m Feb 20 04:52:37 localhost dnsmasq[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/addn_hosts - 1 addresses Feb 20 04:52:37 
localhost dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/host Feb 20 04:52:37 localhost systemd[1]: tmp-crun.m6f0tc.mount: Deactivated successfully. Feb 20 04:52:37 localhost dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/opts Feb 20 04:52:37 localhost podman[310292]: 2026-02-20 09:52:37.95235449 +0000 UTC m=+0.065277840 container kill ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:52:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:38.232 264355 INFO neutron.agent.dhcp.agent [None req-e378f723-20c9-4d9f-881b-1e24f8f086d3 - - - - - -] DHCP configuration for ports {'5e004d3c-491a-4ae0-ad9a-f29043bf90a7'} is completed#033[00m Feb 20 04:52:38 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:52:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:52:39 localhost podman[310312]: 2026-02-20 09:52:39.148120116 +0000 UTC m=+0.079418344 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 20 04:52:39 localhost podman[310312]: 2026-02-20 09:52:39.158930368 +0000 UTC m=+0.090228586 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute) Feb 20 04:52:39 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:52:40 localhost nova_compute[281288]: 2026-02-20 09:52:40.595 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:40 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:41.673 264355 INFO neutron.agent.linux.ip_lib [None req-ef006bf0-d1fe-4738-a151-08769a2e9194 - - - - - -] Device tape04a3d27-d2 cannot be used as it has no MAC address#033[00m Feb 20 04:52:41 localhost nova_compute[281288]: 2026-02-20 09:52:41.736 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:41 localhost kernel: device tape04a3d27-d2 entered promiscuous mode Feb 20 04:52:41 localhost NetworkManager[5988]: [1771581161.7466] manager: (tape04a3d27-d2): new Generic device (/org/freedesktop/NetworkManager/Devices/27) Feb 20 04:52:41 localhost ovn_controller[156798]: 2026-02-20T09:52:41Z|00140|binding|INFO|Claiming lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab for this chassis. Feb 20 04:52:41 localhost ovn_controller[156798]: 2026-02-20T09:52:41Z|00141|binding|INFO|e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab: Claiming unknown Feb 20 04:52:41 localhost nova_compute[281288]: 2026-02-20 09:52:41.753 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:41 localhost systemd-udevd[310351]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:52:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:41.764 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2039424f830e4ef5aa461223cac1ffd5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=32672433-7024-441f-825a-7707135603bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:52:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:41.767 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab in datapath 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9 bound to our chassis#033[00m Feb 20 04:52:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:41.774 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port d954753b-8f76-43d0-95a6-a39aa6a0330d IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:52:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:41.774 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:52:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:41.775 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[8d413acf-5040-48af-ab55-74175dde4b3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:52:41 localhost journal[229984]: ethtool ioctl error on tape04a3d27-d2: No such device Feb 20 04:52:41 localhost journal[229984]: ethtool ioctl error on tape04a3d27-d2: No such device Feb 20 04:52:41 localhost ovn_controller[156798]: 2026-02-20T09:52:41Z|00142|binding|INFO|Setting lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab ovn-installed in OVS Feb 20 04:52:41 localhost ovn_controller[156798]: 2026-02-20T09:52:41Z|00143|binding|INFO|Setting lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab up in Southbound Feb 20 04:52:41 localhost nova_compute[281288]: 2026-02-20 09:52:41.806 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:41 localhost journal[229984]: ethtool ioctl error on tape04a3d27-d2: No such device Feb 20 04:52:41 localhost journal[229984]: ethtool ioctl error on tape04a3d27-d2: No such device Feb 20 04:52:41 localhost journal[229984]: ethtool ioctl error on tape04a3d27-d2: No such device Feb 20 04:52:41 localhost journal[229984]: ethtool ioctl error on tape04a3d27-d2: No such device Feb 20 04:52:41 localhost journal[229984]: ethtool ioctl error on tape04a3d27-d2: No such device Feb 20 04:52:41 localhost journal[229984]: ethtool ioctl error on tape04a3d27-d2: No such device Feb 
20 04:52:41 localhost nova_compute[281288]: 2026-02-20 09:52:41.845 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:41 localhost nova_compute[281288]: 2026-02-20 09:52:41.876 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:41 localhost podman[310383]: 2026-02-20 09:52:41.922796516 +0000 UTC m=+0.059965281 container kill ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 20 04:52:41 localhost dnsmasq[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/addn_hosts - 0 addresses Feb 20 04:52:41 localhost dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/host Feb 20 04:52:41 localhost dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/opts Feb 20 04:52:41 localhost systemd[1]: tmp-crun.y6SinQ.mount: Deactivated successfully. 
Feb 20 04:52:42 localhost ovn_controller[156798]: 2026-02-20T09:52:42Z|00144|binding|INFO|Releasing lport 7fc1435c-10da-4551-8776-a30225c1584b from this chassis (sb_readonly=0) Feb 20 04:52:42 localhost kernel: device tap7fc1435c-10 left promiscuous mode Feb 20 04:52:42 localhost nova_compute[281288]: 2026-02-20 09:52:42.088 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:42 localhost ovn_controller[156798]: 2026-02-20T09:52:42Z|00145|binding|INFO|Setting lport 7fc1435c-10da-4551-8776-a30225c1584b down in Southbound Feb 20 04:52:42 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:42.099 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-d260fabb-b595-4411-92db-b47a732060f6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d260fabb-b595-4411-92db-b47a732060f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca1bfffabed04c6d8fc33cdd0ddf56a4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46ce4a9-c894-4089-9b26-e586ca861a84, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7fc1435c-10da-4551-8776-a30225c1584b) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:52:42 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:42.100 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 7fc1435c-10da-4551-8776-a30225c1584b in datapath d260fabb-b595-4411-92db-b47a732060f6 unbound from our chassis#033[00m Feb 20 04:52:42 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:42.101 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d260fabb-b595-4411-92db-b47a732060f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:52:42 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:42.102 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5720a484-671e-4a17-b479-35da2881524b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:52:42 localhost nova_compute[281288]: 2026-02-20 09:52:42.108 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:42 localhost nova_compute[281288]: 2026-02-20 09:52:42.110 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:42 localhost nova_compute[281288]: 2026-02-20 09:52:42.587 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. 
Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.588464) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162588508, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1126, "num_deletes": 253, "total_data_size": 1367012, "memory_usage": 1395672, "flush_reason": "Manual Compaction"} Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162595542, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 646394, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17058, "largest_seqno": 18179, "table_properties": {"data_size": 642460, "index_size": 1597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10892, "raw_average_key_size": 21, "raw_value_size": 633788, "raw_average_value_size": 1245, "num_data_blocks": 71, "num_entries": 509, "num_filter_entries": 509, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581095, "oldest_key_time": 1771581095, "file_creation_time": 1771581162, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 7127 microseconds, and 3400 cpu microseconds. Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.595591) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 646394 bytes OK Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.595615) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.597672) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.597697) EVENT_LOG_v1 {"time_micros": 1771581162597690, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.597717) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1361433, prev total WAL file size 
1361757, number of live WAL files 2. Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.599669) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373536' seq:72057594037927935, type:22 .. '6D6772737461740034303037' seq:0, type:0; will stop at (end) Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(631KB)], [21(18MB)] Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162599716, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19536903, "oldest_snapshot_seqno": -1} Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 12299 keys, 17598600 bytes, temperature: kUnknown Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162681755, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 17598600, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17530187, "index_size": 36568, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30789, "raw_key_size": 328436, "raw_average_key_size": 26, "raw_value_size": 
17322493, "raw_average_value_size": 1408, "num_data_blocks": 1390, "num_entries": 12299, "num_filter_entries": 12299, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581162, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.682027) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 17598600 bytes Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.683693) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 237.9 rd, 214.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 18.0 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(57.5) write-amplify(27.2) OK, records in: 12798, records dropped: 499 output_compression: NoCompression Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.683715) EVENT_LOG_v1 {"time_micros": 1771581162683706, "job": 10, "event": "compaction_finished", "compaction_time_micros": 82134, "compaction_time_cpu_micros": 46366, "output_level": 6, "num_output_files": 1, "total_output_size": 17598600, "num_input_records": 12798, "num_output_records": 12299, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162683879, "job": 10, "event": "table_file_deletion", "file_number": 23} Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162685719, 
"job": 10, "event": "table_file_deletion", "file_number": 21} Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.599564) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:52:42 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:52:42 localhost podman[310451]: Feb 20 04:52:42 localhost podman[310451]: 2026-02-20 09:52:42.71761479 +0000 UTC m=+0.086050621 container create b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:52:42 localhost systemd[1]: Started libpod-conmon-b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf.scope. 
Feb 20 04:52:42 localhost systemd[1]: Started libcrun container. Feb 20 04:52:42 localhost podman[310451]: 2026-02-20 09:52:42.675456701 +0000 UTC m=+0.043892572 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:52:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e326f6f6e12d73c52b76eeb8e2267cd2522e4c76d841349323c3f224f65b99/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:52:42 localhost podman[310451]: 2026-02-20 09:52:42.786360023 +0000 UTC m=+0.154795864 container init b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 20 04:52:42 localhost podman[310451]: 2026-02-20 09:52:42.796098413 +0000 UTC m=+0.164534254 container start b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:52:42 localhost dnsmasq[310469]: started, version 2.85 cachesize 150 Feb 20 04:52:42 localhost dnsmasq[310469]: DNS service limited to local subnets Feb 20 04:52:42 localhost 
dnsmasq[310469]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:52:42 localhost dnsmasq[310469]: warning: no upstream servers configured Feb 20 04:52:42 localhost dnsmasq-dhcp[310469]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:52:42 localhost dnsmasq[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/addn_hosts - 0 addresses Feb 20 04:52:42 localhost dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/host Feb 20 04:52:42 localhost dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/opts Feb 20 04:52:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:42.892 264355 INFO neutron.agent.dhcp.agent [None req-539f1d93-9842-489c-8a64-1b88aa6e049f - - - - - -] DHCP configuration for ports {'8089ee58-1d9f-439a-ac4b-21c4ed035ba8'} is completed#033[00m Feb 20 04:52:44 localhost nova_compute[281288]: 2026-02-20 09:52:44.245 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:44 localhost ovn_controller[156798]: 2026-02-20T09:52:44Z|00146|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:45 localhost nova_compute[281288]: 2026-02-20 09:52:45.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:45 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e110 e110: 6 total, 6 up, 6 in Feb 20 04:52:45 localhost nova_compute[281288]: 2026-02-20 09:52:45.597 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:45 localhost systemd[1]: tmp-crun.txsIH4.mount: Deactivated successfully. 
Feb 20 04:52:45 localhost dnsmasq[310152]: exiting on receipt of SIGTERM Feb 20 04:52:45 localhost podman[310485]: 2026-02-20 09:52:45.651488702 +0000 UTC m=+0.072347176 container kill ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:52:45 localhost systemd[1]: libpod-ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331.scope: Deactivated successfully. Feb 20 04:52:45 localhost podman[310498]: 2026-02-20 09:52:45.719392312 +0000 UTC m=+0.056835056 container died ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 20 04:52:45 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:45 localhost podman[310498]: 2026-02-20 09:52:45.753140517 +0000 UTC m=+0.090583261 container cleanup ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:52:45 localhost systemd[1]: libpod-conmon-ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331.scope: Deactivated successfully. Feb 20 04:52:45 localhost podman[310505]: 2026-02-20 09:52:45.796325746 +0000 UTC m=+0.122251150 container remove ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 20 04:52:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:46.033 264355 INFO neutron.agent.dhcp.agent [None req-33bf7414-f555-433c-8361-85d7e8823947 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:46.174 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:46 localhost nova_compute[281288]: 2026-02-20 09:52:46.497 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:46 localhost systemd[1]: 
var-lib-containers-storage-overlay-92bd17572daf82e995501baa24d56fe0cc42bf636ed181dc5240c8cecc9c8bc8-merged.mount: Deactivated successfully. Feb 20 04:52:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331-userdata-shm.mount: Deactivated successfully. Feb 20 04:52:46 localhost systemd[1]: run-netns-qdhcp\x2dd260fabb\x2db595\x2d4411\x2d92db\x2db47a732060f6.mount: Deactivated successfully. Feb 20 04:52:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:46.865 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:52:46Z, description=, device_id=d472f0b4-01df-4346-9239-5246395c8051, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ffc0e5b5-3d48-4ba4-b97e-817d2cfecbaa, ip_allocation=immediate, mac_address=fa:16:3e:9a:69:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:52:39Z, description=, dns_domain=, id=76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-1575908493-network, port_security_enabled=True, project_id=2039424f830e4ef5aa461223cac1ffd5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39792, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=894, status=ACTIVE, subnets=['ddb79cd2-81f4-40d9-9ce0-203e7af8c023'], tags=[], tenant_id=2039424f830e4ef5aa461223cac1ffd5, updated_at=2026-02-20T09:52:39Z, vlan_transparent=None, network_id=76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, port_security_enabled=False, project_id=2039424f830e4ef5aa461223cac1ffd5, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=953, status=DOWN, tags=[], tenant_id=2039424f830e4ef5aa461223cac1ffd5, updated_at=2026-02-20T09:52:46Z on network 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9#033[00m Feb 20 04:52:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:46.997 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:47 localhost systemd[1]: tmp-crun.X8E8QB.mount: Deactivated successfully. Feb 20 04:52:47 localhost dnsmasq[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/addn_hosts - 1 addresses Feb 20 04:52:47 localhost dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/host Feb 20 04:52:47 localhost dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/opts Feb 20 04:52:47 localhost podman[310544]: 2026-02-20 09:52:47.09839958 +0000 UTC m=+0.076254395 container kill b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:52:47 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e111 e111: 6 total, 6 up, 6 in Feb 20 04:52:47 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:47.375 264355 INFO neutron.agent.dhcp.agent [None req-bdd6a5c4-2441-4f1e-9251-d014d8172b09 - - - - - -] DHCP configuration for ports {'ffc0e5b5-3d48-4ba4-b97e-817d2cfecbaa'} is completed#033[00m Feb 20 04:52:47 localhost 
nova_compute[281288]: 2026-02-20 09:52:47.622 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:47 localhost podman[241968]: time="2026-02-20T09:52:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:52:47 localhost podman[241968]: @ - - [20/Feb/2026:09:52:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157080 "" "Go-http-client/1.1" Feb 20 04:52:47 localhost podman[241968]: @ - - [20/Feb/2026:09:52:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18804 "" "Go-http-client/1.1" Feb 20 04:52:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:52:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:49.092 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:52:46Z, description=, device_id=d472f0b4-01df-4346-9239-5246395c8051, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ffc0e5b5-3d48-4ba4-b97e-817d2cfecbaa, ip_allocation=immediate, mac_address=fa:16:3e:9a:69:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:52:39Z, description=, dns_domain=, id=76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-1575908493-network, port_security_enabled=True, project_id=2039424f830e4ef5aa461223cac1ffd5, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=39792, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=894, status=ACTIVE, subnets=['ddb79cd2-81f4-40d9-9ce0-203e7af8c023'], tags=[], tenant_id=2039424f830e4ef5aa461223cac1ffd5, updated_at=2026-02-20T09:52:39Z, vlan_transparent=None, network_id=76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, port_security_enabled=False, project_id=2039424f830e4ef5aa461223cac1ffd5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=953, status=DOWN, tags=[], tenant_id=2039424f830e4ef5aa461223cac1ffd5, updated_at=2026-02-20T09:52:46Z on network 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9#033[00m Feb 20 04:52:49 localhost systemd[1]: tmp-crun.MoB5l7.mount: Deactivated successfully. Feb 20 04:52:49 localhost podman[310564]: 2026-02-20 09:52:49.170364201 +0000 UTC m=+0.104768350 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:52:49 localhost podman[310564]: 2026-02-20 09:52:49.18318424 +0000 UTC m=+0.117588369 container 
exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:52:49 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:52:49 localhost dnsmasq[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/addn_hosts - 1 addresses Feb 20 04:52:49 localhost dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/host Feb 20 04:52:49 localhost podman[310606]: 2026-02-20 09:52:49.315729031 +0000 UTC m=+0.061234029 container kill b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:52:49 localhost dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/opts Feb 20 04:52:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:49.622 264355 INFO neutron.agent.dhcp.agent [None req-95e51d7c-eec2-451f-bfd2-388d45597fda - - - - - -] DHCP configuration for ports {'ffc0e5b5-3d48-4ba4-b97e-817d2cfecbaa'} is completed#033[00m Feb 20 04:52:49 localhost nova_compute[281288]: 2026-02-20 09:52:49.925 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:50 localhost nova_compute[281288]: 2026-02-20 09:52:50.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:50 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:51 localhost nova_compute[281288]: 2026-02-20 09:52:51.524 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:52 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 e112: 6 total, 6 up, 6 in Feb 20 04:52:52 localhost nova_compute[281288]: 2026-02-20 09:52:52.625 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:52 localhost nova_compute[281288]: 2026-02-20 09:52:52.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:52 localhost nova_compute[281288]: 2026-02-20 09:52:52.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:52:52 localhost nova_compute[281288]: 2026-02-20 09:52:52.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:52:52 localhost nova_compute[281288]: 2026-02-20 09:52:52.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:52:52 localhost nova_compute[281288]: 2026-02-20 09:52:52.747 281292 DEBUG 
nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:52:52 localhost nova_compute[281288]: 2026-02-20 09:52:52.747 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:52:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:52:53 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2540140338' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.207 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.274 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.275 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.495 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.497 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11366MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.497 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.498 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.602 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.603 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.603 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:52:53 localhost nova_compute[281288]: 2026-02-20 09:52:53.638 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:52:54 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:52:54 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2299012886' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:52:54 localhost nova_compute[281288]: 2026-02-20 09:52:54.154 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:54 localhost nova_compute[281288]: 2026-02-20 09:52:54.158 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:52:54 localhost nova_compute[281288]: 2026-02-20 09:52:54.164 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:52:54 localhost nova_compute[281288]: 2026-02-20 09:52:54.178 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:52:54 localhost nova_compute[281288]: 2026-02-20 09:52:54.181 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for 
np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:52:54 localhost nova_compute[281288]: 2026-02-20 09:52:54.181 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:52:54 localhost dnsmasq[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/addn_hosts - 0 addresses Feb 20 04:52:54 localhost podman[310687]: 2026-02-20 09:52:54.227921021 +0000 UTC m=+0.044613124 container kill b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:52:54 localhost dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/host Feb 20 04:52:54 localhost dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/opts Feb 20 04:52:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:52:54 localhost systemd[1]: tmp-crun.2MMR78.mount: Deactivated successfully. Feb 20 04:52:54 localhost systemd[1]: tmp-crun.hSJIzO.mount: Deactivated successfully. 
Feb 20 04:52:54 localhost podman[310700]: 2026-02-20 09:52:54.312038463 +0000 UTC m=+0.069935372 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:52:54 localhost podman[310700]: 2026-02-20 09:52:54.320538391 +0000 UTC m=+0.078435290 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:52:54 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:52:54 localhost ovn_controller[156798]: 2026-02-20T09:52:54Z|00147|binding|INFO|Releasing lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab from this chassis (sb_readonly=0) Feb 20 04:52:54 localhost nova_compute[281288]: 2026-02-20 09:52:54.407 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:54 localhost ovn_controller[156798]: 2026-02-20T09:52:54Z|00148|binding|INFO|Setting lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab down in Southbound Feb 20 04:52:54 localhost kernel: device tape04a3d27-d2 left promiscuous mode Feb 20 04:52:54 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:54.415 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2039424f830e4ef5aa461223cac1ffd5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=32672433-7024-441f-825a-7707135603bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:52:54 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:54.416 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab in datapath 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9 unbound from our chassis#033[00m Feb 20 04:52:54 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:54.418 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:52:54 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:54.419 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[685bc5f8-ea40-4ad1-8024-976a766add66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:52:54 localhost nova_compute[281288]: 2026-02-20 09:52:54.436 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:55 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:55.210 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:55 localhost nova_compute[281288]: 2026-02-20 09:52:55.671 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:52:56 localhost sshd[310733]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:52:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:56.045 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched 
UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:52:56 localhost nova_compute[281288]: 2026-02-20 09:52:56.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:52:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:52:56.048 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:52:56 localhost podman[310734]: 2026-02-20 09:52:56.144631262 +0000 UTC m=+0.084186675 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-type=git, version=9.7, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 04:52:56 localhost podman[310734]: 2026-02-20 09:52:56.158453641 +0000 UTC m=+0.098009064 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7) Feb 20 04:52:56 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:52:56 localhost neutron_sriov_agent[257177]: 2026-02-20 09:52:56.306 2 INFO neutron.agent.securitygroups_rpc [None req-35b4089d-d96b-4223-91a7-29363be26031 9b5edcaf5d0f48eea2ef440e3b3c2f79 85741ccf160049968710bbf0d3ed7a21 - - default default] Security group member updated ['1f0747df-ad50-4106-9a56-f1b68b2201c8']#033[00m Feb 20 04:52:56 localhost ovn_controller[156798]: 2026-02-20T09:52:56Z|00149|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:56 localhost nova_compute[281288]: 2026-02-20 09:52:56.387 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:56 localhost openstack_network_exporter[244414]: ERROR 09:52:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:52:56 localhost openstack_network_exporter[244414]: Feb 20 04:52:56 localhost openstack_network_exporter[244414]: ERROR 09:52:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:52:56 localhost openstack_network_exporter[244414]: Feb 20 04:52:57 localhost nova_compute[281288]: 2026-02-20 09:52:57.182 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:57 localhost nova_compute[281288]: 2026-02-20 09:52:57.184 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:57 localhost nova_compute[281288]: 2026-02-20 09:52:57.185 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] 
Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:57 localhost nova_compute[281288]: 2026-02-20 09:52:57.185 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:57 localhost nova_compute[281288]: 2026-02-20 09:52:57.628 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:57 localhost ovn_controller[156798]: 2026-02-20T09:52:57Z|00150|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:57 localhost nova_compute[281288]: 2026-02-20 09:52:57.687 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:57 localhost nova_compute[281288]: 2026-02-20 09:52:57.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:57 localhost nova_compute[281288]: 2026-02-20 09:52:57.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:58 localhost ovn_controller[156798]: 2026-02-20T09:52:58Z|00151|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.070 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:58 localhost ovn_controller[156798]: 2026-02-20T09:52:58Z|00152|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.341 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:52:58 localhost neutron_sriov_agent[257177]: 2026-02-20 09:52:58.354 2 INFO neutron.agent.securitygroups_rpc [None req-3bf8b391-96d6-4728-ae92-83d8f7b4ba3a 9b5edcaf5d0f48eea2ef440e3b3c2f79 85741ccf160049968710bbf0d3ed7a21 - - default default] Security group member updated ['1f0747df-ad50-4106-9a56-f1b68b2201c8']#033[00m Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:52:58 localhost dnsmasq[310469]: exiting on receipt of SIGTERM Feb 20 04:52:58 localhost podman[310774]: 2026-02-20 09:52:58.801168819 +0000 UTC m=+0.060013312 container kill 
b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:52:58 localhost systemd[1]: libpod-b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf.scope: Deactivated successfully. Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.809 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.809 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.809 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:52:58 localhost nova_compute[281288]: 2026-02-20 09:52:58.810 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:52:58 localhost podman[310789]: 2026-02-20 09:52:58.870107831 +0000 UTC m=+0.049159303 container died b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 04:52:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf-userdata-shm.mount: Deactivated successfully. Feb 20 04:52:58 localhost systemd[1]: var-lib-containers-storage-overlay-46e326f6f6e12d73c52b76eeb8e2267cd2522e4c76d841349323c3f224f65b99-merged.mount: Deactivated successfully. Feb 20 04:52:58 localhost podman[310789]: 2026-02-20 09:52:58.918030894 +0000 UTC m=+0.097082316 container remove b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:52:58 localhost systemd[1]: libpod-conmon-b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf.scope: Deactivated successfully. 
Feb 20 04:52:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:58.963 264355 INFO neutron.agent.dhcp.agent [None req-f9b829a4-b8c5-467b-8847-f002c25f37b2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:58 localhost systemd[1]: run-netns-qdhcp\x2d76a56eb0\x2d2df1\x2d4460\x2da42f\x2d3d7d5c92bfc9.mount: Deactivated successfully. Feb 20 04:52:59 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:52:59.147 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:52:59 localhost nova_compute[281288]: 2026-02-20 09:52:59.432 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:52:59 localhost nova_compute[281288]: 2026-02-20 09:52:59.464 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:52:59 localhost nova_compute[281288]: 2026-02-20 09:52:59.465 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:53:00 localhost nova_compute[281288]: 2026-02-20 09:53:00.709 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:00 localhost nova_compute[281288]: 2026-02-20 09:53:00.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:53:00 localhost nova_compute[281288]: 2026-02-20 09:53:00.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:53:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 04:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:53:01 localhost podman[310815]: 2026-02-20 09:53:01.159685853 +0000 UTC m=+0.085570426 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:53:01 localhost podman[310815]: 2026-02-20 09:53:01.199071358 +0000 UTC m=+0.124955891 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:53:01 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:53:01 localhost podman[310814]: 2026-02-20 09:53:01.211011881 +0000 UTC m=+0.143634379 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 20 04:53:01 localhost podman[310814]: 2026-02-20 09:53:01.246410115 +0000 UTC m=+0.179032623 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller) Feb 20 04:53:01 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:53:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:53:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/4284690376' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:53:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:53:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4284690376' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:53:02 localhost nova_compute[281288]: 2026-02-20 09:53:02.187 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:02 localhost nova_compute[281288]: 2026-02-20 09:53:02.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:05.051 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:53:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:05 localhost nova_compute[281288]: 2026-02-20 09:53:05.740 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:06 localhost sshd[310855]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:53:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:06.016 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" 
by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:53:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:53:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:53:06 localhost sshd[310857]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:53:07 localhost nova_compute[281288]: 2026-02-20 09:53:07.679 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:53:10 localhost podman[310859]: 2026-02-20 09:53:10.156723403 +0000 UTC m=+0.094751176 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 20 04:53:10 localhost podman[310859]: 2026-02-20 09:53:10.172929945 +0000 UTC m=+0.110957728 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 20 04:53:10 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:53:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:10 localhost nova_compute[281288]: 2026-02-20 09:53:10.789 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:11 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:11.447 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:12 localhost sshd[310878]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:53:12 localhost nova_compute[281288]: 2026-02-20 09:53:12.683 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:14.368 2 INFO neutron.agent.securitygroups_rpc [None req-8db2cda0-f70c-405a-8e32-bbd09e8f7101 253c13edd6b940a9b5cd64cc7bfa0ff5 bae77758d77d4d43af7ac10744892742 - - default default] Security group member updated ['f7e09389-cc97-415e-b643-e34abd10c9b7']#033[00m Feb 20 04:53:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:14.445 2 INFO neutron.agent.securitygroups_rpc [None req-8db2cda0-f70c-405a-8e32-bbd09e8f7101 253c13edd6b940a9b5cd64cc7bfa0ff5 bae77758d77d4d43af7ac10744892742 - - default default] Security group member updated ['f7e09389-cc97-415e-b643-e34abd10c9b7']#033[00m Feb 20 04:53:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:14.933 2 INFO neutron.agent.securitygroups_rpc [None req-ea37899b-0895-4039-936c-a92dc4af71cc 253c13edd6b940a9b5cd64cc7bfa0ff5 bae77758d77d4d43af7ac10744892742 
- - default default] Security group member updated ['f7e09389-cc97-415e-b643-e34abd10c9b7']#033[00m Feb 20 04:53:15 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:15.277 2 INFO neutron.agent.securitygroups_rpc [None req-6181ff08-47fa-4ccb-88a6-fd4810762b1a 253c13edd6b940a9b5cd64cc7bfa0ff5 bae77758d77d4d43af7ac10744892742 - - default default] Security group member updated ['f7e09389-cc97-415e-b643-e34abd10c9b7']#033[00m Feb 20 04:53:15 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:15.313 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:15 localhost nova_compute[281288]: 2026-02-20 09:53:15.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:16 localhost sshd[310880]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:53:17 localhost ovn_controller[156798]: 2026-02-20T09:53:17Z|00153|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:53:17 localhost nova_compute[281288]: 2026-02-20 09:53:17.460 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:17 localhost nova_compute[281288]: 2026-02-20 09:53:17.685 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:17 localhost podman[241968]: time="2026-02-20T09:53:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:53:17 localhost podman[241968]: @ - - [20/Feb/2026:09:53:17 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:53:17 localhost podman[241968]: @ - - [20/Feb/2026:09:53:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18338 "" "Go-http-client/1.1" Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.209 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.209 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.209 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 20 04:53:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.215 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85cddcab-e1ba-4038-8de8-3a88f2ce8995', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.210353', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fb9be9be-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 
'message_signature': 'ec8bf3898737f735819f5db2fffcd7b1cef436b8da5111d5343b5388f0163bb2'}]}, 'timestamp': '2026-02-20 09:53:18.216140', '_unique_id': 'f9220cc9dcaf4bedbb201fd4203e388f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.219 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6ea0b5f5-d6e2-440d-95b8-2a9f6776a2db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.219165', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fb9c78ca-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': '0dedeb6f160609512b7914c61c4300fe74db374d331378eca39b439140ac4835'}]}, 'timestamp': '2026-02-20 09:53:18.219840', '_unique_id': '0bc49c2bea0543ccb83ee6f31a2a1bc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.254 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c2c45bc-ed93-4641-b579-1294869be61a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.222152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fba1e364-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '1bf9f50a05e1746c237d63c3c69a328e5d366319834db50c5dddfebcb68ae0a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.222152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fba1f8f4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': 'e42fb933d7779b7147f878dea45cb254bee717a0bd6b22f4d1a15f3263af9bd2'}]}, 'timestamp': '2026-02-20 09:53:18.255857', '_unique_id': '7f53ff5bef6246d4b5c3c730185fbfb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.274 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 17020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd8b2d02-e4c4-495a-b2b9-55f5435517c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17020000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:53:18.258530', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'fba4f946-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.514018839, 'message_signature': 'df1ef83cc1684faccd8605f907629be6c001aad8c4a4fa9e8923099591f95fbd'}]}, 'timestamp': '2026-02-20 09:53:18.275443', '_unique_id': 'dbb20c1906a14f3b827193e07031ba90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.277 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.277 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ef533af-af0f-4e25-8cf8-5e338f170601', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:53:18.277874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fba56aac-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.514018839, 'message_signature': 'b721388681561c72bfe9ba88155dad8c74a2e075fe40ff63210e38aad8b6811d'}]}, 'timestamp': '2026-02-20 09:53:18.278336', '_unique_id': 'e931144974234ea29000b2386dcb38c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]:
2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.280 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:53:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.280 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4fa010f-b7c4-4225-b2ae-b615a33f470d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.280705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fba5db36-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '73537861ef1090a9df195dee265a1250c6f189ac533046a907ec4ebd8897575b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.280705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fba5ec3e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': 'c0db6ae4c5e421cd4ff187a2d9ad3698cccd156c274aed84d9145e45c5851e77'}]}, 'timestamp': '2026-02-20 09:53:18.281667', '_unique_id': 'f22991d8cdaf47868cc7c68bf4f3a3c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:53:18.282 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.283 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd307503f-3c91-4ff8-b50e-f969d51c6198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.283990', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tape7aa8e2a-27'}, 'message_id': 'fba659e4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'f2b2447a9e91c2a71defcb61139c7fe11f93a740d93b7824c93483dbcc9479f9'}]}, 'timestamp': '2026-02-20 09:53:18.284473', '_unique_id': '558f5cb1a1a7448b9cf75cf5bfed0817'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5268b6d8-b743-4a71-9f8b-20d21884d5c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.287083', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fba6d266-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'b6520196223df5e0f543ef08f6afad0ead823ae41cbbdb6fabc4aa220ebef70d'}]}, 'timestamp': '2026-02-20 09:53:18.287561', '_unique_id': '9d1816720a47461cb95ec87612a97fe7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.289 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68599abd-d603-465d-8431-7afccb8e1a4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.289787', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fba73bde-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'a30b2b0f901c1c65584a8e4d93445e2080a03dedf16a9e84b2b83987400e23fe'}]}, 'timestamp': '2026-02-20 09:53:18.290255', '_unique_id': 'a8b1c1b12545456694aa54158c324b11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbb12f8a-4e29-4391-bba0-be94fe59193d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.292512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fba9730e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '02b1cdbee1e4ea7777f95c66368d3cfdc7b6152cd49bbb6cb939cada2bad08c6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.292512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fba98e2a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '91fd42f9192411ae6fba0e9dbef4dae4b52a8c53d57e7a1220a62ee802c98e30'}]}, 'timestamp': '2026-02-20 09:53:18.305561', '_unique_id': '629cfc0e1f694b999d0c83f614321673'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20
09:53:18.306 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:53:18.306 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.308 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.309 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 
ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b83e493a-a202-46bc-a838-5e3f73f39da8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.309118', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbaa338e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': '78a8f7da5b12390c8e030950bf4b112c70cc55a3d63ee2cffc76fcaaa83ca456'}]}, 'timestamp': '2026-02-20 09:53:18.309854', '_unique_id': '01e859984deb4cdfa5e4e963c1ecaa95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.312 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '38a52775-8893-4a95-961e-af422b28c926', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.312965', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbaac95c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '77ff1392e57a5ac9d529cfcd7977f27696b89c8ab970853976924796aae37889'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.312965', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbaae2de-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': 'cb286e6a78437a62746219f0c8e5b121f0b09d1d96ad5630b4af8e1cda189f66'}]}, 'timestamp': '2026-02-20 09:53:18.314274', '_unique_id': '01a55539d2804ace84ae5fff71aacd4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dcd4fa4-3c24-4026-a756-14ce0d02655c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.317620', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbab81c6-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': '1475d9c436cc2d054b481cc40e6645bd6ec8f541e3078616d314a6df1535537a'}]}, 'timestamp': '2026-02-20 09:53:18.318370', '_unique_id': 'b1e2c5016071465bb8c66f8a7fa9fd1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7349302d-2bce-4b7a-bb34-d02612d57054', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.321482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbac17a8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '0e742abaf9f7d3bdd4dd13df30bbbf7770bc1a71ba3a3c38bc535a55f82bb927'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.321482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbac2ffe-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '257e2a370ced78e7b31eef770ca3c83c10f8add5c313cab02d1c418e4cc12d1f'}]}, 'timestamp': '2026-02-20 09:53:18.322812', '_unique_id': '1bcea503a3834a948383da9c0576dc68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd92dafe8-51cb-47a0-aa11-faae5e7ba786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.325571', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbacb6d6-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': 'e634c65c5c6d84667c738ca6939bf2ac41663e55d5c11dcfe954649b787bff3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.325571', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbaccdba-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '00b82c3e770a946cf6482421293c9ee40c41c7a34d19d8cb6148f554fb550c67'}]}, 'timestamp': '2026-02-20 09:53:18.326879', '_unique_id': '9b41e0686aa94f6494818509c2ad36c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20
04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bc3ffec8-432e-44a6-b28f-aae7d213a014', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.329307', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbad41fa-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'b5bdd9ac208a5e9e32641e2e0922892b52ac5300f96b2ed7ca22e6d8c58e6558'}]}, 'timestamp': '2026-02-20 09:53:18.329712', '_unique_id': 'e4e93e7b88fc41f79a1e03b08035d698'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3852c963-af1f-4647-8dd0-4370211f7714', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.331519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbad98ee-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '77102fccf1a9ee2ea964e690ee698c1f53dfaf5da22173d7796aa7b6a5651b90'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.331519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbada7f8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '3f2796e8765632aa5ee0d8e3ef8176e606bed23d4b4c8a3875b357bdac4f65e5'}]}, 'timestamp': '2026-02-20 09:53:18.332293', '_unique_id': '99512252946d46829ae6f1ccb6c4befd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.334 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.334 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '414acb6d-44d8-411d-b824-ae94bed388ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.334213', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbae070c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'd8248b47590ac660e28b6345ce9ec6007b678c03ee3d60b12232df571e3367ca'}]}, 'timestamp': '2026-02-20 09:53:18.334768', '_unique_id': '43c305aca04c49358f994a2fcc7e584a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.336 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.336 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.336 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd47adfac-9b9e-4e9e-a759-817126384ebe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.336328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbae53ec-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': 'b245e35a6facc9bc943024e6a6b8d5877997d459152435736c6056321de71a1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.336328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbae5f40-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '5f4f25f9788d274201f0deb104015658512facd7a232eaa2279f602ec6918beb'}]}, 'timestamp': '2026-02-20 09:53:18.336928', '_unique_id': '386b57fead55499a82356e464939fecc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:53:18.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.338 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9360518f-f609-4a5d-a2df-31059eb52a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.338269', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbae9e92-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': '4eab9125a22dabf1173bf4a27feeab46384bc1fc01e7477eed28623dde72b665'}]}, 'timestamp': '2026-02-20 09:53:18.338576', '_unique_id': 'bb7e0563df5e4f68bf3d149787426264'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.340 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4c22ad54-6342-43b1-bed6-538b47cbfece', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.339940', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbaedf4c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '71095806b1157a0a0e36f87f03a8530def284fe994a9f0c8db2e997fc43a37c2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.339940', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbaee8fc-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '3ec6a8390e73d67dffd5868fba5dce4fbb4b4008dc9d3523e4858b9eab818d11'}]}, 'timestamp': '2026-02-20 09:53:18.340453', '_unique_id': '1effd05e8ddd4b339f1a2abef4e9a899'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:53:18.341 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:53:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:53:18.341 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:53:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging Feb 20 04:53:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:53:20 localhost systemd[1]: tmp-crun.xDbH2D.mount: Deactivated successfully. 
Feb 20 04:53:20 localhost podman[310882]: 2026-02-20 09:53:20.153393612 +0000 UTC m=+0.090496896 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:53:20 localhost podman[310882]: 2026-02-20 09:53:20.16615841 +0000 UTC m=+0.103261634 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:53:20 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:53:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:20 localhost nova_compute[281288]: 2026-02-20 09:53:20.874 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.619402) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202619444, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 752, "num_deletes": 256, "total_data_size": 682253, "memory_usage": 696848, "flush_reason": "Manual Compaction"} Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202625012, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 444548, "file_checksum": "", "file_checksum_func_name": 
"Unknown", "smallest_seqno": 18184, "largest_seqno": 18931, "table_properties": {"data_size": 441222, "index_size": 1181, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7843, "raw_average_key_size": 18, "raw_value_size": 434335, "raw_average_value_size": 1049, "num_data_blocks": 52, "num_entries": 414, "num_filter_entries": 414, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581162, "oldest_key_time": 1771581162, "file_creation_time": 1771581202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 5662 microseconds, and 2083 cpu microseconds. Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.625065) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 444548 bytes OK Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.625088) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.626858) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.626881) EVENT_LOG_v1 {"time_micros": 1771581202626875, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.626902) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 678233, prev total WAL file size 678233, number of live WAL files 2. Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.627563) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373731' seq:72057594037927935, type:22 .. 
'6C6F676D0034303232' seq:0, type:0; will stop at (end) Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(434KB)], [24(16MB)] Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202628242, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 18043148, "oldest_snapshot_seqno": -1} Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 12184 keys, 17901858 bytes, temperature: kUnknown Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202716036, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 17901858, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17833220, "index_size": 37113, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 327028, "raw_average_key_size": 26, "raw_value_size": 17626568, "raw_average_value_size": 1446, "num_data_blocks": 1409, "num_entries": 12184, "num_filter_entries": 12184, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.716386) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 17901858 bytes Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.718049) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 205.5 rd, 203.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 16.8 +0.0 blob) out(17.1 +0.0 blob), read-write-amplify(80.9) write-amplify(40.3) OK, records in: 12713, records dropped: 529 output_compression: NoCompression Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.718082) EVENT_LOG_v1 {"time_micros": 1771581202718067, "job": 12, "event": "compaction_finished", "compaction_time_micros": 87800, "compaction_time_cpu_micros": 50830, "output_level": 6, "num_output_files": 1, "total_output_size": 17901858, "num_input_records": 12713, "num_output_records": 12184, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005625204/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202718307, "job": 12, "event": "table_file_deletion", "file_number": 26} Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202720678, "job": 12, "event": "table_file_deletion", "file_number": 24} Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.627485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720756) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:53:22 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720766) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:53:22 localhost nova_compute[281288]: 2026-02-20 09:53:22.730 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:23 localhost sshd[310906]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:53:24 localhost podman[310908]: 2026-02-20 09:53:24.53282162 +0000 UTC m=+0.069846820 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:53:24 localhost podman[310908]: 2026-02-20 09:53:24.541947537 +0000 UTC m=+0.078972747 container exec_died 
f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:53:24 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:53:25 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:25.029 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:25 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:25.074 2 INFO neutron.agent.securitygroups_rpc [None req-9d454723-199e-4c87-997c-435a75780787 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:25 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:25.324 264355 INFO neutron.agent.linux.ip_lib [None req-13bbaa62-f04c-4d9b-baf6-53ade5713af0 - - - - - -] Device tapf687f8e2-05 cannot be used as it has no MAC address#033[00m Feb 20 04:53:25 localhost nova_compute[281288]: 2026-02-20 09:53:25.391 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:25 localhost kernel: device tapf687f8e2-05 entered promiscuous mode Feb 20 04:53:25 localhost nova_compute[281288]: 2026-02-20 09:53:25.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:25 localhost NetworkManager[5988]: [1771581205.4026] manager: (tapf687f8e2-05): new Generic device (/org/freedesktop/NetworkManager/Devices/28) Feb 20 04:53:25 localhost ovn_controller[156798]: 2026-02-20T09:53:25Z|00154|binding|INFO|Claiming lport f687f8e2-05bc-41c6-b5b4-d21133776b71 for this chassis. Feb 20 04:53:25 localhost ovn_controller[156798]: 2026-02-20T09:53:25Z|00155|binding|INFO|f687f8e2-05bc-41c6-b5b4-d21133776b71: Claiming unknown Feb 20 04:53:25 localhost systemd-udevd[310942]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:53:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:25.416 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-a25517b2-5049-4a57-ad98-549dad6f59bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a25517b2-5049-4a57-ad98-549dad6f59bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5a2540adf694dd98037b7689be10187', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dffa584b-3b39-44ca-bfd8-0760f34a6a59, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f687f8e2-05bc-41c6-b5b4-d21133776b71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:53:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:25.419 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f687f8e2-05bc-41c6-b5b4-d21133776b71 in datapath a25517b2-5049-4a57-ad98-549dad6f59bf bound to our chassis#033[00m Feb 20 04:53:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:25.422 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7e882f4a-c254-4aea-aa42-5ef07e3c29fe IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:53:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:25.422 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a25517b2-5049-4a57-ad98-549dad6f59bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:53:25 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:25.424 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[ab932b0c-eac9-4e06-82d4-835fcd089d30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:53:25 localhost ovn_controller[156798]: 2026-02-20T09:53:25Z|00156|binding|INFO|Setting lport f687f8e2-05bc-41c6-b5b4-d21133776b71 ovn-installed in OVS Feb 20 04:53:25 localhost ovn_controller[156798]: 2026-02-20T09:53:25Z|00157|binding|INFO|Setting lport f687f8e2-05bc-41c6-b5b4-d21133776b71 up in Southbound Feb 20 04:53:25 localhost nova_compute[281288]: 2026-02-20 09:53:25.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:25 localhost nova_compute[281288]: 2026-02-20 09:53:25.499 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:25 localhost nova_compute[281288]: 2026-02-20 09:53:25.532 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:25 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:25.544 2 INFO neutron.agent.securitygroups_rpc [None req-1234b8d9-654d-451e-95cb-316b1fc4ede0 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:25 localhost 
ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:25 localhost nova_compute[281288]: 2026-02-20 09:53:25.876 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:26 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:26.315 2 INFO neutron.agent.securitygroups_rpc [None req-2cdd2daf-d30d-4deb-a790-e995ba310f91 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:26 localhost podman[310997]: Feb 20 04:53:26 localhost podman[310997]: 2026-02-20 09:53:26.358142509 +0000 UTC m=+0.088521276 container create e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 20 04:53:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:53:26 localhost systemd[1]: Started libpod-conmon-e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981.scope. Feb 20 04:53:26 localhost podman[310997]: 2026-02-20 09:53:26.316199696 +0000 UTC m=+0.046578513 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:53:26 localhost systemd[1]: Started libcrun container. 
Feb 20 04:53:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f122d768a0c33de3c5f1477629b4a40cd49a429ab5028fb1c39b3e5c6badc0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:53:26 localhost podman[310997]: 2026-02-20 09:53:26.438766865 +0000 UTC m=+0.169145632 container init e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:53:26 localhost dnsmasq[311027]: started, version 2.85 cachesize 150 Feb 20 04:53:26 localhost dnsmasq[311027]: DNS service limited to local subnets Feb 20 04:53:26 localhost dnsmasq[311027]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:53:26 localhost dnsmasq[311027]: warning: no upstream servers configured Feb 20 04:53:26 localhost dnsmasq-dhcp[311027]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:53:26 localhost dnsmasq[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/addn_hosts - 0 addresses Feb 20 04:53:26 localhost dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/host Feb 20 04:53:26 localhost dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/opts Feb 20 04:53:26 localhost podman[311012]: 2026-02-20 09:53:26.4850933 +0000 UTC m=+0.084735241 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, distribution-scope=public, version=9.7) Feb 20 04:53:26 localhost podman[310997]: 2026-02-20 09:53:26.505992605 +0000 UTC m=+0.236371322 container start e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 20 04:53:26 localhost podman[311012]: 2026-02-20 09:53:26.526240509 +0000 UTC m=+0.125882510 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7) Feb 20 04:53:26 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:53:26 localhost openstack_network_exporter[244414]: ERROR 09:53:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:53:26 localhost openstack_network_exporter[244414]: Feb 20 04:53:26 localhost openstack_network_exporter[244414]: ERROR 09:53:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:53:26 localhost openstack_network_exporter[244414]: Feb 20 04:53:26 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:26.718 264355 INFO neutron.agent.dhcp.agent [None req-43377fec-b530-4ba7-bacd-9799b71bae15 - - - - - -] DHCP configuration for ports {'533631ef-8ab5-4d0a-a021-5d8f4521578b'} is completed#033[00m Feb 20 04:53:27 localhost nova_compute[281288]: 2026-02-20 09:53:27.236 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:27 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:27.302 2 INFO neutron.agent.securitygroups_rpc [None req-170d638f-d647-4f80-a7a3-f133bd9dbf7c 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:27 localhost systemd[1]: tmp-crun.Ua84ZA.mount: Deactivated successfully. 
Feb 20 04:53:27 localhost nova_compute[281288]: 2026-02-20 09:53:27.762 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:28.422 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:53:28Z, description=, device_id=c60b906a-b861-42f1-98c4-c7541cbc3cf5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=70f9ae8f-4400-4da9-92b5-3befab79f396, ip_allocation=immediate, mac_address=fa:16:3e:33:28:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:53:23Z, description=, dns_domain=, id=a25517b2-5049-4a57-ad98-549dad6f59bf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-1446003171-network, port_security_enabled=True, project_id=c5a2540adf694dd98037b7689be10187, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12102, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1222, status=ACTIVE, subnets=['a7e1a7af-4593-4a29-a16f-95ab101a15e7'], tags=[], tenant_id=c5a2540adf694dd98037b7689be10187, updated_at=2026-02-20T09:53:23Z, vlan_transparent=None, network_id=a25517b2-5049-4a57-ad98-549dad6f59bf, port_security_enabled=False, project_id=c5a2540adf694dd98037b7689be10187, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1258, status=DOWN, tags=[], tenant_id=c5a2540adf694dd98037b7689be10187, updated_at=2026-02-20T09:53:28Z on network 
a25517b2-5049-4a57-ad98-549dad6f59bf#033[00m Feb 20 04:53:28 localhost podman[311054]: 2026-02-20 09:53:28.648540839 +0000 UTC m=+0.062962612 container kill e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:53:28 localhost dnsmasq[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/addn_hosts - 1 addresses Feb 20 04:53:28 localhost dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/host Feb 20 04:53:28 localhost dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/opts Feb 20 04:53:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:28.867 264355 INFO neutron.agent.dhcp.agent [None req-8949ddb7-38c0-4438-b691-395532cd4240 - - - - - -] DHCP configuration for ports {'70f9ae8f-4400-4da9-92b5-3befab79f396'} is completed#033[00m Feb 20 04:53:28 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:28.909 2 INFO neutron.agent.securitygroups_rpc [None req-c8a66b7a-94e3-4469-9bdc-a709861d759e 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:29 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:29.296 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2026-02-20T09:53:28Z, description=, device_id=c60b906a-b861-42f1-98c4-c7541cbc3cf5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=70f9ae8f-4400-4da9-92b5-3befab79f396, ip_allocation=immediate, mac_address=fa:16:3e:33:28:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:53:23Z, description=, dns_domain=, id=a25517b2-5049-4a57-ad98-549dad6f59bf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-1446003171-network, port_security_enabled=True, project_id=c5a2540adf694dd98037b7689be10187, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12102, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1222, status=ACTIVE, subnets=['a7e1a7af-4593-4a29-a16f-95ab101a15e7'], tags=[], tenant_id=c5a2540adf694dd98037b7689be10187, updated_at=2026-02-20T09:53:23Z, vlan_transparent=None, network_id=a25517b2-5049-4a57-ad98-549dad6f59bf, port_security_enabled=False, project_id=c5a2540adf694dd98037b7689be10187, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1258, status=DOWN, tags=[], tenant_id=c5a2540adf694dd98037b7689be10187, updated_at=2026-02-20T09:53:28Z on network a25517b2-5049-4a57-ad98-549dad6f59bf#033[00m Feb 20 04:53:29 localhost dnsmasq[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/addn_hosts - 1 addresses Feb 20 04:53:29 localhost dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/host Feb 20 04:53:29 localhost podman[311093]: 2026-02-20 09:53:29.497743292 +0000 UTC m=+0.063865859 container kill e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:53:29 localhost dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/opts Feb 20 04:53:29 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:29.554 2 INFO neutron.agent.securitygroups_rpc [None req-47b66a4e-d860-4539-a526-725ae67efd11 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:29 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:29.722 264355 INFO neutron.agent.dhcp.agent [None req-54eea36c-7112-4fb3-897a-62ef0cd96ce8 - - - - - -] DHCP configuration for ports {'70f9ae8f-4400-4da9-92b5-3befab79f396'} is completed#033[00m Feb 20 04:53:30 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Feb 20 04:53:30 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:30.457 2 INFO neutron.agent.securitygroups_rpc [None req-11ba55d2-9392-4bf8-866e-0f6b7421a111 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:30 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:30.868 2 INFO neutron.agent.securitygroups_rpc [None req-813a578a-3c42-4faa-a8fa-18fef9f75e4f 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:30 localhost nova_compute[281288]: 2026-02-20 09:53:30.877 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:31 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:31.469 2 INFO neutron.agent.securitygroups_rpc [None req-cf8e4aa2-945c-4732-b1d9-06cd52c9d8b9 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:53:32 localhost podman[311115]: 2026-02-20 09:53:32.153375 +0000 UTC m=+0.090781995 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller) Feb 20 04:53:32 localhost podman[311115]: 2026-02-20 09:53:32.203177912 +0000 UTC m=+0.140584947 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:53:32 localhost podman[311116]: 2026-02-20 09:53:32.216860697 +0000 UTC m=+0.150901830 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:53:32 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:53:32 localhost podman[311116]: 2026-02-20 09:53:32.231104849 +0000 UTC m=+0.165145982 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:53:32 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:53:32 localhost sshd[311158]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:53:32 localhost nova_compute[281288]: 2026-02-20 09:53:32.765 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:32 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:32.884 2 INFO neutron.agent.securitygroups_rpc [None req-0ba4bd77-300a-4aef-8bf9-70b27ff0d0d5 eedc91db7da847aab912b3b8401d5b18 8d5c2f81bbf4423c8ccdbeb44081c499 - - default default] Security group member updated ['943c86ba-7264-4974-89ae-938b95d72620']#033[00m Feb 20 04:53:33 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:33.258 2 INFO neutron.agent.securitygroups_rpc [None req-e49f1726-acc6-4366-8506-477a12f2a7e4 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:33 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:33.259 2 INFO neutron.agent.securitygroups_rpc [None req-bbdf9d9d-afcb-4396-b80a-79eb3001d8e5 eedc91db7da847aab912b3b8401d5b18 8d5c2f81bbf4423c8ccdbeb44081c499 - - default default] Security group member updated ['943c86ba-7264-4974-89ae-938b95d72620']#033[00m Feb 20 04:53:33 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:33.963 2 INFO neutron.agent.securitygroups_rpc [None req-55d03934-9517-419a-915b-7eb31a90c9a3 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:34 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:34.366 2 INFO neutron.agent.securitygroups_rpc [None req-b58a03ae-8801-426a-8262-b9afab11fa37 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member 
updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']#033[00m Feb 20 04:53:34 localhost dnsmasq[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/addn_hosts - 0 addresses Feb 20 04:53:34 localhost dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/host Feb 20 04:53:34 localhost dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/opts Feb 20 04:53:34 localhost podman[311177]: 2026-02-20 09:53:34.688981038 +0000 UTC m=+0.060674022 container kill e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:53:34 localhost ovn_controller[156798]: 2026-02-20T09:53:34Z|00158|binding|INFO|Releasing lport f687f8e2-05bc-41c6-b5b4-d21133776b71 from this chassis (sb_readonly=0) Feb 20 04:53:34 localhost kernel: device tapf687f8e2-05 left promiscuous mode Feb 20 04:53:34 localhost ovn_controller[156798]: 2026-02-20T09:53:34Z|00159|binding|INFO|Setting lport f687f8e2-05bc-41c6-b5b4-d21133776b71 down in Southbound Feb 20 04:53:34 localhost nova_compute[281288]: 2026-02-20 09:53:34.863 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:34 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:34.877 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-a25517b2-5049-4a57-ad98-549dad6f59bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a25517b2-5049-4a57-ad98-549dad6f59bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5a2540adf694dd98037b7689be10187', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dffa584b-3b39-44ca-bfd8-0760f34a6a59, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f687f8e2-05bc-41c6-b5b4-d21133776b71) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:53:34 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:34.879 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f687f8e2-05bc-41c6-b5b4-d21133776b71 in datapath a25517b2-5049-4a57-ad98-549dad6f59bf unbound from our chassis#033[00m Feb 20 04:53:34 localhost nova_compute[281288]: 2026-02-20 09:53:34.880 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:34 localhost nova_compute[281288]: 2026-02-20 09:53:34.882 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:34 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:34.883 162652 DEBUG 
neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a25517b2-5049-4a57-ad98-549dad6f59bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:53:34 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:34.884 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[75771f41-cb0f-4d14-8be3-57319846f24a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:53:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:35 localhost nova_compute[281288]: 2026-02-20 09:53:35.913 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:36 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:36.131 264355 INFO neutron.agent.linux.ip_lib [None req-d35a7a42-6de1-4598-9fef-1364e6632d43 - - - - - -] Device tap5a613b4f-1d cannot be used as it has no MAC address#033[00m Feb 20 04:53:36 localhost nova_compute[281288]: 2026-02-20 09:53:36.159 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:36 localhost kernel: device tap5a613b4f-1d entered promiscuous mode Feb 20 04:53:36 localhost NetworkManager[5988]: [1771581216.1711] manager: (tap5a613b4f-1d): new Generic device (/org/freedesktop/NetworkManager/Devices/29) Feb 20 04:53:36 localhost ovn_controller[156798]: 2026-02-20T09:53:36Z|00160|binding|INFO|Claiming lport 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 for this chassis. 
Feb 20 04:53:36 localhost ovn_controller[156798]: 2026-02-20T09:53:36Z|00161|binding|INFO|5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422: Claiming unknown Feb 20 04:53:36 localhost nova_compute[281288]: 2026-02-20 09:53:36.173 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:36 localhost systemd-udevd[311211]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:53:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:36.182 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4d3d4b22-89ae-4b72-8269-db16bc023693', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d3d4b22-89ae-4b72-8269-db16bc023693', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c31c2cb7-7585-4255-8195-898589cc1c5d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:53:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:36.184 162652 INFO 
neutron.agent.ovn.metadata.agent [-] Port 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 in datapath 4d3d4b22-89ae-4b72-8269-db16bc023693 bound to our chassis#033[00m Feb 20 04:53:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:36.186 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4d3d4b22-89ae-4b72-8269-db16bc023693 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:53:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:36.188 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1d846e24-d841-4d57-aa5f-5822655873cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:53:36 localhost journal[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device Feb 20 04:53:36 localhost ovn_controller[156798]: 2026-02-20T09:53:36Z|00162|binding|INFO|Setting lport 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 ovn-installed in OVS Feb 20 04:53:36 localhost ovn_controller[156798]: 2026-02-20T09:53:36Z|00163|binding|INFO|Setting lport 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 up in Southbound Feb 20 04:53:36 localhost nova_compute[281288]: 2026-02-20 09:53:36.209 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:36 localhost journal[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device Feb 20 04:53:36 localhost journal[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device Feb 20 04:53:36 localhost journal[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device Feb 20 04:53:36 localhost journal[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device Feb 20 04:53:36 localhost journal[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device Feb 20 04:53:36 localhost journal[229984]: ethtool ioctl error on tap5a613b4f-1d: 
No such device Feb 20 04:53:36 localhost journal[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device Feb 20 04:53:36 localhost nova_compute[281288]: 2026-02-20 09:53:36.250 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:36 localhost nova_compute[281288]: 2026-02-20 09:53:36.287 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:37 localhost podman[311350]: Feb 20 04:53:37 localhost podman[311350]: 2026-02-20 09:53:37.152436195 +0000 UTC m=+0.094227349 container create e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:53:37 localhost systemd[1]: Started libpod-conmon-e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0.scope. Feb 20 04:53:37 localhost podman[311350]: 2026-02-20 09:53:37.100738067 +0000 UTC m=+0.042529311 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:53:37 localhost systemd[1]: tmp-crun.yFDNIv.mount: Deactivated successfully. Feb 20 04:53:37 localhost systemd[1]: Started libcrun container. 
Feb 20 04:53:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f09de51b9fc12b68b5d61385aaa744ed437b92490600eef9db4abfcb8fb69f0d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:53:37 localhost podman[311350]: 2026-02-20 09:53:37.238112365 +0000 UTC m=+0.179903519 container init e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:53:37 localhost podman[311350]: 2026-02-20 09:53:37.24523183 +0000 UTC m=+0.187023014 container start e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:53:37 localhost dnsmasq[311368]: started, version 2.85 cachesize 150 Feb 20 04:53:37 localhost dnsmasq[311368]: DNS service limited to local subnets Feb 20 04:53:37 localhost dnsmasq[311368]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:53:37 localhost dnsmasq[311368]: warning: no upstream servers 
configured Feb 20 04:53:37 localhost dnsmasq-dhcp[311368]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Feb 20 04:53:37 localhost dnsmasq[311368]: read /var/lib/neutron/dhcp/4d3d4b22-89ae-4b72-8269-db16bc023693/addn_hosts - 0 addresses Feb 20 04:53:37 localhost dnsmasq-dhcp[311368]: read /var/lib/neutron/dhcp/4d3d4b22-89ae-4b72-8269-db16bc023693/host Feb 20 04:53:37 localhost dnsmasq-dhcp[311368]: read /var/lib/neutron/dhcp/4d3d4b22-89ae-4b72-8269-db16bc023693/opts Feb 20 04:53:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:37.408 264355 INFO neutron.agent.dhcp.agent [None req-337407bb-513c-414c-ae67-9214459e5909 - - - - - -] DHCP configuration for ports {'ff22c793-cbfd-4069-ba90-fc7d2d6ce756'} is completed#033[00m Feb 20 04:53:37 localhost nova_compute[281288]: 2026-02-20 09:53:37.808 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:37 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:53:37 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:53:38 localhost ovn_controller[156798]: 2026-02-20T09:53:38Z|00164|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:53:38 localhost nova_compute[281288]: 2026-02-20 09:53:38.296 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:39 localhost podman[311402]: 2026-02-20 09:53:39.035345761 +0000 UTC m=+0.066020814 container kill e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:53:39 localhost dnsmasq[311027]: exiting on receipt of SIGTERM Feb 20 04:53:39 localhost systemd[1]: tmp-crun.XgagWJ.mount: Deactivated successfully. Feb 20 04:53:39 localhost systemd[1]: libpod-e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981.scope: Deactivated successfully. Feb 20 04:53:39 localhost podman[311418]: 2026-02-20 09:53:39.117967248 +0000 UTC m=+0.063372424 container died e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:53:39 localhost podman[311418]: 2026-02-20 09:53:39.149452343 +0000 UTC m=+0.094857519 container cleanup e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:53:39 localhost systemd[1]: 
libpod-conmon-e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981.scope: Deactivated successfully. Feb 20 04:53:39 localhost systemd[1]: var-lib-containers-storage-overlay-d8f122d768a0c33de3c5f1477629b4a40cd49a429ab5028fb1c39b3e5c6badc0-merged.mount: Deactivated successfully. Feb 20 04:53:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981-userdata-shm.mount: Deactivated successfully. Feb 20 04:53:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:53:39 localhost podman[311419]: 2026-02-20 09:53:39.188142847 +0000 UTC m=+0.131377307 container remove e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:53:39 localhost systemd[1]: run-netns-qdhcp\x2da25517b2\x2d5049\x2d4a57\x2dad98\x2d549dad6f59bf.mount: Deactivated successfully. 
Feb 20 04:53:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:39.226 264355 INFO neutron.agent.dhcp.agent [None req-cae3916f-00a6-4868-8b5d-8a4a4171ef81 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:39.549 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:40 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:40 localhost nova_compute[281288]: 2026-02-20 09:53:40.948 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:53:41 localhost podman[311446]: 2026-02-20 09:53:41.147979086 +0000 UTC m=+0.084633069 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:53:41 localhost podman[311446]: 2026-02-20 09:53:41.160193956 +0000 UTC m=+0.096847959 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Feb 20 04:53:41 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. 
Feb 20 04:53:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:41.199 264355 INFO neutron.agent.linux.ip_lib [None req-af1bf122-f499-4e2b-a4de-2605dede576f - - - - - -] Device tap470cc6c5-ef cannot be used as it has no MAC address#033[00m Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.226 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost kernel: device tap470cc6c5-ef entered promiscuous mode Feb 20 04:53:41 localhost NetworkManager[5988]: [1771581221.2359] manager: (tap470cc6c5-ef): new Generic device (/org/freedesktop/NetworkManager/Devices/30) Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.236 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost ovn_controller[156798]: 2026-02-20T09:53:41Z|00165|binding|INFO|Claiming lport 470cc6c5-ef78-4a24-869b-d34965cd09b3 for this chassis. Feb 20 04:53:41 localhost ovn_controller[156798]: 2026-02-20T09:53:41Z|00166|binding|INFO|470cc6c5-ef78-4a24-869b-d34965cd09b3: Claiming unknown Feb 20 04:53:41 localhost systemd-udevd[311476]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.249 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-567afd59-5f4e-4b3f-84e1-87f98341e0f7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-567afd59-5f4e-4b3f-84e1-87f98341e0f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46b3b5fc-1b08-4417-b622-f776fc49a8cc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=470cc6c5-ef78-4a24-869b-d34965cd09b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.251 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 470cc6c5-ef78-4a24-869b-d34965cd09b3 in datapath 567afd59-5f4e-4b3f-84e1-87f98341e0f7 bound to our chassis#033[00m Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.252 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 567afd59-5f4e-4b3f-84e1-87f98341e0f7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.253 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[65e924c6-3284-456c-9e75-182c962f5cc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:53:41 localhost journal[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.272 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost ovn_controller[156798]: 2026-02-20T09:53:41Z|00167|binding|INFO|Setting lport 470cc6c5-ef78-4a24-869b-d34965cd09b3 ovn-installed in OVS Feb 20 04:53:41 localhost ovn_controller[156798]: 2026-02-20T09:53:41Z|00168|binding|INFO|Setting lport 470cc6c5-ef78-4a24-869b-d34965cd09b3 up in Southbound Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.277 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.281 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost journal[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device Feb 20 04:53:41 localhost journal[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device Feb 20 04:53:41 localhost journal[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device Feb 20 04:53:41 localhost journal[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device Feb 20 04:53:41 localhost journal[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device Feb 20 04:53:41 localhost journal[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device Feb 20 04:53:41 localhost 
journal[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.319 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.348 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost ovn_controller[156798]: 2026-02-20T09:53:41Z|00169|binding|INFO|Removing iface tap470cc6c5-ef ovn-installed in OVS Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.881 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 947ddfc5-c53d-4d85-b4de-5bbea365f011 with type ""#033[00m Feb 20 04:53:41 localhost ovn_controller[156798]: 2026-02-20T09:53:41Z|00170|binding|INFO|Removing lport 470cc6c5-ef78-4a24-869b-d34965cd09b3 ovn-installed in OVS Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.882 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-567afd59-5f4e-4b3f-84e1-87f98341e0f7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-567afd59-5f4e-4b3f-84e1-87f98341e0f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': 
'', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46b3b5fc-1b08-4417-b622-f776fc49a8cc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=470cc6c5-ef78-4a24-869b-d34965cd09b3) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.882 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.885 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 470cc6c5-ef78-4a24-869b-d34965cd09b3 in datapath 567afd59-5f4e-4b3f-84e1-87f98341e0f7 unbound from our chassis#033[00m Feb 20 04:53:41 localhost nova_compute[281288]: 2026-02-20 09:53:41.885 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.887 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 567afd59-5f4e-4b3f-84e1-87f98341e0f7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:53:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:41.887 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ff1356-1a35-4cef-896b-c14614e5605c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:53:42 localhost ovn_controller[156798]: 2026-02-20T09:53:42Z|00171|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:53:42 localhost nova_compute[281288]: 2026-02-20 09:53:42.093 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:42 localhost podman[311545]: Feb 20 04:53:42 localhost podman[311545]: 2026-02-20 09:53:42.117416388 +0000 UTC m=+0.119959041 container create dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:53:42 localhost podman[311545]: 2026-02-20 09:53:42.046056282 +0000 UTC m=+0.048598965 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:53:42 localhost systemd[1]: Started libpod-conmon-dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347.scope. Feb 20 04:53:42 localhost systemd[1]: tmp-crun.yAAPuR.mount: Deactivated successfully. Feb 20 04:53:42 localhost systemd[1]: Started libcrun container. 
Feb 20 04:53:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb973c786a4a0e773abfd175e84104553d0c9274294a3974cf56eb42cc926635/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:53:42 localhost podman[311545]: 2026-02-20 09:53:42.18377077 +0000 UTC m=+0.186313433 container init dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:53:42 localhost podman[311545]: 2026-02-20 09:53:42.193766794 +0000 UTC m=+0.196309447 container start dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:53:42 localhost dnsmasq[311563]: started, version 2.85 cachesize 150 Feb 20 04:53:42 localhost dnsmasq[311563]: DNS service limited to local subnets Feb 20 04:53:42 localhost dnsmasq[311563]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:53:42 localhost dnsmasq[311563]: warning: no upstream servers 
configured Feb 20 04:53:42 localhost dnsmasq-dhcp[311563]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:53:42 localhost dnsmasq[311563]: read /var/lib/neutron/dhcp/567afd59-5f4e-4b3f-84e1-87f98341e0f7/addn_hosts - 0 addresses Feb 20 04:53:42 localhost dnsmasq-dhcp[311563]: read /var/lib/neutron/dhcp/567afd59-5f4e-4b3f-84e1-87f98341e0f7/host Feb 20 04:53:42 localhost dnsmasq-dhcp[311563]: read /var/lib/neutron/dhcp/567afd59-5f4e-4b3f-84e1-87f98341e0f7/opts Feb 20 04:53:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:42.267 264355 INFO neutron.agent.dhcp.agent [None req-fd94e593-7a9b-43c1-b2e6-a3e510d8b525 - - - - - -] DHCP configuration for ports {'dd020ec0-dd49-42b0-8f23-d87283043b3d'} is completed#033[00m Feb 20 04:53:42 localhost dnsmasq[311563]: exiting on receipt of SIGTERM Feb 20 04:53:42 localhost podman[311581]: 2026-02-20 09:53:42.422861965 +0000 UTC m=+0.058640920 container kill dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:53:42 localhost systemd[1]: libpod-dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347.scope: Deactivated successfully. 
Feb 20 04:53:42 localhost podman[311594]: 2026-02-20 09:53:42.490457396 +0000 UTC m=+0.054997520 container died dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 20 04:53:42 localhost podman[311594]: 2026-02-20 09:53:42.520257079 +0000 UTC m=+0.084797163 container cleanup dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:53:42 localhost systemd[1]: libpod-conmon-dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347.scope: Deactivated successfully. 
Feb 20 04:53:42 localhost podman[311596]: 2026-02-20 09:53:42.567695448 +0000 UTC m=+0.123612561 container remove dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:53:42 localhost kernel: device tap470cc6c5-ef left promiscuous mode Feb 20 04:53:42 localhost nova_compute[281288]: 2026-02-20 09:53:42.579 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:42 localhost nova_compute[281288]: 2026-02-20 09:53:42.593 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:42.619 264355 INFO neutron.agent.dhcp.agent [None req-a584c831-5291-46d0-94ac-116dd4400218 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:42.620 264355 INFO neutron.agent.dhcp.agent [None req-a584c831-5291-46d0-94ac-116dd4400218 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:42 localhost nova_compute[281288]: 2026-02-20 09:53:42.811 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:43 localhost systemd[1]: 
var-lib-containers-storage-overlay-eb973c786a4a0e773abfd175e84104553d0c9274294a3974cf56eb42cc926635-merged.mount: Deactivated successfully. Feb 20 04:53:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347-userdata-shm.mount: Deactivated successfully. Feb 20 04:53:43 localhost systemd[1]: run-netns-qdhcp\x2d567afd59\x2d5f4e\x2d4b3f\x2d84e1\x2d87f98341e0f7.mount: Deactivated successfully. Feb 20 04:53:43 localhost nova_compute[281288]: 2026-02-20 09:53:43.546 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.495 264355 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Feb 20 04:53:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.617 264355 INFO neutron.agent.dhcp.agent [None req-8fbabde5-8c48-45b8-bf8d-fe8f256e795b - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:53:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.618 264355 INFO neutron.agent.dhcp.agent [-] Starting network a6178b53-adde-4e45-a5fb-ba3e8333d1f7 dhcp configuration#033[00m Feb 20 04:53:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.618 264355 INFO neutron.agent.dhcp.agent [-] Finished network a6178b53-adde-4e45-a5fb-ba3e8333d1f7 dhcp configuration#033[00m Feb 20 04:53:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.619 264355 INFO neutron.agent.dhcp.agent [None req-8fbabde5-8c48-45b8-bf8d-fe8f256e795b - - - - - -] Synchronizing state complete#033[00m Feb 20 04:53:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.619 264355 INFO neutron.agent.dhcp.agent [None req-757a0fa7-eac3-47ea-9d4e-70afd5a945ff - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:45 localhost neutron_sriov_agent[257177]: 2026-02-20 
09:53:45.179 2 INFO neutron.agent.securitygroups_rpc [None req-a4da6bb1-700c-4d71-a646-fe34335ad1c4 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:45.184 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:45 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:45 localhost nova_compute[281288]: 2026-02-20 09:53:45.783 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:45 localhost nova_compute[281288]: 2026-02-20 09:53:45.950 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:46 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:46.461 2 INFO neutron.agent.securitygroups_rpc [None req-82d4853b-8792-42bf-a9bd-621206147606 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:47 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:47.540 2 INFO neutron.agent.securitygroups_rpc [None req-2045a801-a884-4a06-b206-987ac9e8d82c 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:47 localhost podman[241968]: time="2026-02-20T09:53:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:53:47 localhost podman[241968]: @ - - [20/Feb/2026:09:53:47 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 20 04:53:47 localhost podman[241968]: @ - - [20/Feb/2026:09:53:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18808 "" "Go-http-client/1.1" Feb 20 04:53:47 localhost nova_compute[281288]: 2026-02-20 09:53:47.813 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:47 localhost ovn_controller[156798]: 2026-02-20T09:53:47Z|00172|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:53:47 localhost nova_compute[281288]: 2026-02-20 09:53:47.961 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:48 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e113 e113: 6 total, 6 up, 6 in Feb 20 04:53:48 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:48.550 2 INFO neutron.agent.securitygroups_rpc [None req-34460107-5767-4790-bb51-43f170627a06 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:48 localhost ovn_controller[156798]: 2026-02-20T09:53:48Z|00173|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:53:49 localhost nova_compute[281288]: 2026-02-20 09:53:49.010 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:49 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:49.997 2 INFO neutron.agent.securitygroups_rpc [None req-f7428e6a-a4fd-4f95-a528-55a12406007a 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - 
default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:50 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:50 localhost nova_compute[281288]: 2026-02-20 09:53:50.995 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:53:51 localhost podman[311624]: 2026-02-20 09:53:51.152056509 +0000 UTC m=+0.086570038 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:53:51 localhost podman[311624]: 2026-02-20 09:53:51.166126375 +0000 UTC m=+0.100639964 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:53:51 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:53:51 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:51.242 2 INFO neutron.agent.securitygroups_rpc [None req-e18dac8b-7691-49db-a6f9-a9ef86b93bee 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.791 264355 INFO neutron.agent.dhcp.agent [None req-8fbabde5-8c48-45b8-bf8d-fe8f256e795b - - - - - -] Synchronizing state#033[00m Feb 20 04:53:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.939 264355 INFO neutron.agent.dhcp.agent [None req-99c0b8ff-a1dd-4d14-8ec8-9a6b4f78d738 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:53:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.940 264355 INFO neutron.agent.dhcp.agent [-] Starting network cd099dbb-ee85-46d6-aee0-e12b432c9b7f dhcp configuration#033[00m Feb 20 04:53:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.941 264355 INFO neutron.agent.dhcp.agent [-] Finished network cd099dbb-ee85-46d6-aee0-e12b432c9b7f dhcp configuration#033[00m Feb 20 04:53:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.942 264355 INFO neutron.agent.dhcp.agent [None req-99c0b8ff-a1dd-4d14-8ec8-9a6b4f78d738 - - - - - -] Synchronizing state complete#033[00m Feb 20 04:53:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.942 264355 INFO neutron.agent.dhcp.agent [None req-56328f20-7238-4e1a-b62d-244eb3392de4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:52 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e114 e114: 6 total, 6 up, 6 in Feb 20 04:53:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:52.515 2 INFO neutron.agent.securitygroups_rpc [None req-debccdd3-a1fb-4577-8737-97107297a2b7 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group 
member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:52 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:53:52.531 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:53:52 localhost nova_compute[281288]: 2026-02-20 09:53:52.817 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:53 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:53.059 2 INFO neutron.agent.securitygroups_rpc [None req-f48de32a-1488-41ca-9ac7-21eba1b907c6 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e115 e115: 6 total, 6 up, 6 in Feb 20 04:53:53 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:53.497 2 INFO neutron.agent.securitygroups_rpc [None req-2540619c-1d57-4e86-a386-76258833753f 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:54 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:54.638 2 INFO neutron.agent.securitygroups_rpc [None req-6b4ba7bd-a2d9-4ae3-ba6a-627aca1feb8f 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:54 localhost nova_compute[281288]: 2026-02-20 09:53:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:53:54 localhost nova_compute[281288]: 2026-02-20 09:53:54.751 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:53:54 localhost nova_compute[281288]: 2026-02-20 09:53:54.751 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:53:54 localhost nova_compute[281288]: 2026-02-20 09:53:54.751 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:53:54 localhost nova_compute[281288]: 2026-02-20 09:53:54.752 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:53:54 localhost nova_compute[281288]: 2026-02-20 09:53:54.752 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:53:55 localhost sshd[311667]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:53:55 localhost systemd[1]: tmp-crun.mYSorS.mount: Deactivated successfully. Feb 20 04:53:55 localhost podman[311669]: 2026-02-20 09:53:55.144543866 +0000 UTC m=+0.085529156 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:53:55 localhost podman[311669]: 2026-02-20 09:53:55.152366544 +0000 UTC m=+0.093351844 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:53:55 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:53:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:53:55 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2307489764' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.262 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.332 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.333 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:53:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e116 e116: 6 total, 6 up, 6 in Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.568 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.569 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11353MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.570 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.570 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:53:55 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:55.612 2 INFO neutron.agent.securitygroups_rpc [None req-11ea8750-a0ca-4484-988a-d6e41cea3e7c 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.680 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.680 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.681 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:53:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:53:55 localhost nova_compute[281288]: 2026-02-20 09:53:55.736 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:53:56 localhost nova_compute[281288]: 2026-02-20 09:53:56.030 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:53:56 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2163954098' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:53:56 localhost nova_compute[281288]: 2026-02-20 09:53:56.224 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:53:56 localhost nova_compute[281288]: 2026-02-20 09:53:56.230 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:53:56 localhost nova_compute[281288]: 2026-02-20 09:53:56.415 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:53:56 localhost nova_compute[281288]: 2026-02-20 09:53:56.418 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:53:56 localhost nova_compute[281288]: 2026-02-20 09:53:56.418 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:53:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e117 e117: 6 total, 6 up, 6 in Feb 20 04:53:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:56.506 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:53:56 localhost nova_compute[281288]: 2026-02-20 09:53:56.507 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:53:56.507 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:53:56 localhost neutron_sriov_agent[257177]: 2026-02-20 09:53:56.515 2 INFO neutron.agent.securitygroups_rpc [None req-2195fa93-d724-491c-94cb-1a1a7da48d3e 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']#033[00m Feb 20 04:53:56 localhost openstack_network_exporter[244414]: ERROR 09:53:56 appctl.go:174: 
call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:53:56 localhost openstack_network_exporter[244414]: Feb 20 04:53:56 localhost openstack_network_exporter[244414]: ERROR 09:53:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:53:56 localhost openstack_network_exporter[244414]: Feb 20 04:53:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:53:57 localhost podman[311718]: 2026-02-20 09:53:57.14540514 +0000 UTC m=+0.084107523 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, release=1770267347, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, distribution-scope=public, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vcs-type=git) Feb 20 04:53:57 localhost podman[311718]: 2026-02-20 09:53:57.162141008 +0000 UTC m=+0.100843411 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, build-date=2026-02-05T04:57:10Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.7, vendor=Red Hat, Inc.) Feb 20 04:53:57 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:53:57 localhost nova_compute[281288]: 2026-02-20 09:53:57.415 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:53:57 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e118 e118: 6 total, 6 up, 6 in Feb 20 04:53:57 localhost nova_compute[281288]: 2026-02-20 09:53:57.485 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:53:57 localhost nova_compute[281288]: 2026-02-20 09:53:57.485 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:53:57 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e119 e119: 6 total, 6 up, 6 in Feb 20 04:53:57 localhost nova_compute[281288]: 2026-02-20 09:53:57.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:53:57 localhost nova_compute[281288]: 2026-02-20 09:53:57.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:53:57 localhost nova_compute[281288]: 2026-02-20 09:53:57.819 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:53:58 localhost nova_compute[281288]: 2026-02-20 09:53:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:53:58 localhost nova_compute[281288]: 2026-02-20 09:53:58.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:00 localhost nova_compute[281288]: 2026-02-20 09:54:00.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:00 localhost nova_compute[281288]: 2026-02-20 09:54:00.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:54:00 localhost nova_compute[281288]: 2026-02-20 09:54:00.724 281292 DEBUG 
nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:54:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.092 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.094 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:54:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:54:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:54:04 localhost ovn_controller[156798]: 2026-02-20T09:54:04Z|00174|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:04 localhost podman[311738]: 2026-02-20 09:54:04.153876462 +0000 UTC m=+0.047138764 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:54:04 localhost podman[311738]: 2026-02-20 09:54:04.202049396 +0000 UTC m=+0.095311678 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Feb 20 04:54:04 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:54:04 localhost podman[311739]: 2026-02-20 09:54:04.21303661 +0000 UTC m=+0.103877438 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 04:54:04 localhost 
podman[311739]: 2026-02-20 09:54:04.219870947 +0000 UTC m=+0.110711765 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS) Feb 20 04:54:04 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:54:04 localhost ovn_controller[156798]: 2026-02-20T09:54:04Z|00175|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.481 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:04 localhost ovn_controller[156798]: 2026-02-20T09:54:04Z|00176|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.677 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:04 localhost ovn_controller[156798]: 2026-02-20T09:54:04Z|00177|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.920 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.920 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.921 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.921 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:54:04 localhost nova_compute[281288]: 2026-02-20 09:54:04.922 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:05.509 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:54:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:54:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:54:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:54:06 localhost ovn_controller[156798]: 2026-02-20T09:54:06Z|00178|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:06 localhost nova_compute[281288]: 2026-02-20 09:54:06.284 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:06 localhost ovn_controller[156798]: 2026-02-20T09:54:06Z|00179|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:06 localhost nova_compute[281288]: 2026-02-20 09:54:06.398 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:54:06 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1081899576' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:54:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:54:06 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1081899576' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:54:06 localhost ovn_controller[156798]: 2026-02-20T09:54:06Z|00180|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:06 localhost nova_compute[281288]: 2026-02-20 09:54:06.885 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:07 localhost nova_compute[281288]: 2026-02-20 09:54:07.036 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:54:07 
localhost nova_compute[281288]: 2026-02-20 09:54:07.056 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:54:07 localhost nova_compute[281288]: 2026-02-20 09:54:07.057 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:54:07 localhost nova_compute[281288]: 2026-02-20 09:54:07.057 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:07 localhost nova_compute[281288]: 2026-02-20 09:54:07.057 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:54:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e120 e120: 6 total, 6 up, 6 in Feb 20 04:54:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e121 e121: 6 total, 6 up, 6 in Feb 20 04:54:08 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:08.496 2 INFO neutron.agent.securitygroups_rpc [None req-b9c4f92c-e0aa-4ddd-a393-48b8bc5d6b0b 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['b7daa996-a450-47f2-a46b-44613b415203']#033[00m Feb 20 04:54:08 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:08.646 2 INFO neutron.agent.securitygroups_rpc [None req-fa8e1861-ba97-4550-97ae-0d61a37c286d 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['b7daa996-a450-47f2-a46b-44613b415203']#033[00m Feb 20 04:54:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e122 e122: 6 total, 6 up, 6 in Feb 20 04:54:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:54:08 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/523875761' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:54:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:54:08 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/523875761' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:54:09 localhost nova_compute[281288]: 2026-02-20 09:54:09.095 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:09 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:09.142 2 INFO neutron.agent.securitygroups_rpc [None req-60615bbe-69b8-4a6d-a777-f3532d76e589 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:10 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:10.726 2 INFO neutron.agent.securitygroups_rpc [None req-05c5180a-791e-4a36-b283-1b3700162f32 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:11 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:11.072 2 INFO neutron.agent.securitygroups_rpc [None req-fda66745-7058-43a6-bd22-7e541948feca 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:11 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:11.376 2 INFO neutron.agent.securitygroups_rpc [None req-484ad5d4-98d6-44d1-ade8-e83a00451c3f 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:11 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:11.848 2 INFO neutron.agent.securitygroups_rpc [None 
req-d28a91c5-19c2-44e5-9f1d-5b67d2d402bb 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:12 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:12.048 2 INFO neutron.agent.securitygroups_rpc [None req-afec8fb8-e434-475a-b1f9-4fa41fdb543f 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:54:12 localhost podman[311784]: 2026-02-20 09:54:12.140437625 +0000 UTC m=+0.081466812 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Feb 20 04:54:12 localhost podman[311784]: 2026-02-20 09:54:12.148953664 +0000 UTC m=+0.089982821 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:54:12 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:54:12 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:12.313 2 INFO neutron.agent.securitygroups_rpc [None req-ce85a640-3696-4e3b-b081-e77c6a0d5165 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:12 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:12.475 2 INFO neutron.agent.securitygroups_rpc [None req-2e25a288-cc52-4176-8755-11e2b4f58624 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:12 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e123 e123: 6 total, 6 up, 6 in Feb 20 04:54:12 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:12.772 2 INFO neutron.agent.securitygroups_rpc [None req-8a952836-35e7-4a3b-8a56-14552dc3796b 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:13 localhost 
neutron_dhcp_agent[264351]: 2026-02-20 09:54:13.303 264355 INFO neutron.agent.linux.ip_lib [None req-6eb1cc23-1640-4bc7-902e-ac5ab4859a19 - - - - - -] Device tapc66aa099-67 cannot be used as it has no MAC address#033[00m Feb 20 04:54:13 localhost nova_compute[281288]: 2026-02-20 09:54:13.326 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:13 localhost kernel: device tapc66aa099-67 entered promiscuous mode Feb 20 04:54:13 localhost nova_compute[281288]: 2026-02-20 09:54:13.335 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:13 localhost NetworkManager[5988]: [1771581253.3365] manager: (tapc66aa099-67): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Feb 20 04:54:13 localhost ovn_controller[156798]: 2026-02-20T09:54:13Z|00181|binding|INFO|Claiming lport c66aa099-6743-4204-8797-c164cf332f51 for this chassis. Feb 20 04:54:13 localhost ovn_controller[156798]: 2026-02-20T09:54:13Z|00182|binding|INFO|c66aa099-6743-4204-8797-c164cf332f51: Claiming unknown Feb 20 04:54:13 localhost systemd-udevd[311814]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:54:13 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:13.346 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-ce344b2b-0116-41ad-9d2e-ee513171891f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce344b2b-0116-41ad-9d2e-ee513171891f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e50d36cf-02a1-45d9-8e27-acc0fe1139fb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c66aa099-6743-4204-8797-c164cf332f51) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:13 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:13.349 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c66aa099-6743-4204-8797-c164cf332f51 in datapath ce344b2b-0116-41ad-9d2e-ee513171891f bound to our chassis#033[00m Feb 20 04:54:13 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:13.351 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ce344b2b-0116-41ad-9d2e-ee513171891f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:54:13 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:13.352 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5221dac8-6f79-4f99-bb52-31a6304a0e92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:13 localhost journal[229984]: ethtool ioctl error on tapc66aa099-67: No such device Feb 20 04:54:13 localhost journal[229984]: ethtool ioctl error on tapc66aa099-67: No such device Feb 20 04:54:13 localhost ovn_controller[156798]: 2026-02-20T09:54:13Z|00183|binding|INFO|Setting lport c66aa099-6743-4204-8797-c164cf332f51 ovn-installed in OVS Feb 20 04:54:13 localhost ovn_controller[156798]: 2026-02-20T09:54:13Z|00184|binding|INFO|Setting lport c66aa099-6743-4204-8797-c164cf332f51 up in Southbound Feb 20 04:54:13 localhost nova_compute[281288]: 2026-02-20 09:54:13.374 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:13 localhost journal[229984]: ethtool ioctl error on tapc66aa099-67: No such device Feb 20 04:54:13 localhost journal[229984]: ethtool ioctl error on tapc66aa099-67: No such device Feb 20 04:54:13 localhost journal[229984]: ethtool ioctl error on tapc66aa099-67: No such device Feb 20 04:54:13 localhost journal[229984]: ethtool ioctl error on tapc66aa099-67: No such device Feb 20 04:54:13 localhost journal[229984]: ethtool ioctl error on tapc66aa099-67: No such device Feb 20 04:54:13 localhost journal[229984]: ethtool ioctl error on tapc66aa099-67: No such device Feb 20 04:54:13 localhost nova_compute[281288]: 2026-02-20 09:54:13.419 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:13 localhost nova_compute[281288]: 2026-02-20 09:54:13.448 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:13 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:13.692 2 INFO neutron.agent.securitygroups_rpc [None req-72cb9bc1-a00e-467b-ba2c-70207cf7bcb5 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']#033[00m Feb 20 04:54:14 localhost nova_compute[281288]: 2026-02-20 09:54:14.098 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:14 localhost podman[311885]: Feb 20 04:54:14 localhost podman[311885]: 2026-02-20 09:54:14.205029043 +0000 UTC m=+0.066740686 container create 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:54:14 localhost systemd[1]: Started libpod-conmon-7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb.scope. Feb 20 04:54:14 localhost systemd[1]: Started libcrun container. 
Feb 20 04:54:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d89d075bd4bd3762074982c4b1f354c4380264ee115e953111bb781925418ea1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:54:14 localhost podman[311885]: 2026-02-20 09:54:14.165876795 +0000 UTC m=+0.027588528 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:54:14 localhost podman[311885]: 2026-02-20 09:54:14.271768348 +0000 UTC m=+0.133480021 container init 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:54:14 localhost podman[311885]: 2026-02-20 09:54:14.281421321 +0000 UTC m=+0.143132994 container start 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 20 04:54:14 localhost dnsmasq[311904]: started, version 2.85 cachesize 150 Feb 20 04:54:14 localhost dnsmasq[311904]: DNS service limited to local subnets Feb 20 04:54:14 localhost dnsmasq[311904]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:54:14 localhost dnsmasq[311904]: warning: no upstream servers configured Feb 20 04:54:14 localhost dnsmasq-dhcp[311904]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:54:14 localhost dnsmasq[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/addn_hosts - 0 addresses Feb 20 04:54:14 localhost dnsmasq-dhcp[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/host Feb 20 04:54:14 localhost dnsmasq-dhcp[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/opts Feb 20 04:54:14 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:14.409 264355 INFO neutron.agent.dhcp.agent [None req-31651160-d023-4725-9c26-9d793ef8969e - - - - - -] DHCP configuration for ports {'c5fe307a-fee2-41f6-aab2-23f1423edaff'} is completed#033[00m Feb 20 04:54:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:14.479 2 INFO neutron.agent.securitygroups_rpc [None req-0e571d4b-2938-4d22-abb1-e6b913186df6 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['6b2659dc-8adf-40b4-b971-7bc179be3dc5']#033[00m Feb 20 04:54:14 localhost dnsmasq[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/addn_hosts - 0 addresses Feb 20 04:54:14 localhost dnsmasq-dhcp[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/host Feb 20 04:54:14 localhost dnsmasq-dhcp[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/opts Feb 20 04:54:14 localhost podman[311921]: 2026-02-20 09:54:14.589730294 +0000 UTC m=+0.060887748 container kill 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:54:14 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:14.911 264355 INFO neutron.agent.dhcp.agent [None req-17912c8e-9f91-4766-bff5-70038d6a8bd5 - - - - - -] DHCP configuration for ports {'c66aa099-6743-4204-8797-c164cf332f51', 'c5fe307a-fee2-41f6-aab2-23f1423edaff'} is completed#033[00m Feb 20 04:54:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:15.348 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fdc01f3b-a923-4f35-aaea-9adaa7d8d882 with type ""#033[00m Feb 20 04:54:15 localhost ovn_controller[156798]: 2026-02-20T09:54:15Z|00185|binding|INFO|Removing iface tapc66aa099-67 ovn-installed in OVS Feb 20 04:54:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:15.350 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-ce344b2b-0116-41ad-9d2e-ee513171891f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce344b2b-0116-41ad-9d2e-ee513171891f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e50d36cf-02a1-45d9-8e27-acc0fe1139fb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c66aa099-6743-4204-8797-c164cf332f51) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:15.352 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c66aa099-6743-4204-8797-c164cf332f51 in datapath ce344b2b-0116-41ad-9d2e-ee513171891f unbound from our chassis#033[00m Feb 20 04:54:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:15.354 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce344b2b-0116-41ad-9d2e-ee513171891f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:54:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:15.355 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4aa644-422d-4835-819f-2c0853129461]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:15 localhost ovn_controller[156798]: 2026-02-20T09:54:15Z|00186|binding|INFO|Removing lport c66aa099-6743-4204-8797-c164cf332f51 ovn-installed in OVS Feb 20 04:54:15 localhost nova_compute[281288]: 2026-02-20 09:54:15.385 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:15 localhost dnsmasq[311904]: exiting on receipt of SIGTERM Feb 20 04:54:15 localhost podman[311959]: 2026-02-20 09:54:15.489296316 +0000 UTC m=+0.061033953 container kill 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:54:15 localhost systemd[1]: tmp-crun.bDNkfZ.mount: Deactivated successfully. Feb 20 04:54:15 localhost systemd[1]: libpod-7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb.scope: Deactivated successfully. Feb 20 04:54:15 localhost podman[311973]: 2026-02-20 09:54:15.565276451 +0000 UTC m=+0.054420891 container died 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:54:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:54:15 localhost podman[311973]: 2026-02-20 09:54:15.66347487 +0000 UTC m=+0.152619250 container remove 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:54:15 localhost systemd[1]: libpod-conmon-7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb.scope: Deactivated successfully. Feb 20 04:54:15 localhost nova_compute[281288]: 2026-02-20 09:54:15.677 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:15 localhost kernel: device tapc66aa099-67 left promiscuous mode Feb 20 04:54:15 localhost nova_compute[281288]: 2026-02-20 09:54:15.693 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:15 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:15.917 264355 INFO neutron.agent.dhcp.agent [None req-99c0b8ff-a1dd-4d14-8ec8-9a6b4f78d738 - - - - - -] Synchronizing state#033[00m Feb 20 04:54:16 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:16.036 2 INFO neutron.agent.securitygroups_rpc [None req-9065899c-9819-48e8-b360-946a69906bd9 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['56cc5e29-9f6f-4f35-9ade-42d618bdd35b']#033[00m Feb 20 04:54:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.087 264355 INFO neutron.agent.dhcp.agent [None 
req-1a2e3763-4058-4706-bf05-99d8b392f202 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:54:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.088 264355 INFO neutron.agent.dhcp.agent [-] Starting network 9e8d3a83-6496-4dd1-b203-39e31016ea09 dhcp configuration#033[00m Feb 20 04:54:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.089 264355 INFO neutron.agent.dhcp.agent [-] Finished network 9e8d3a83-6496-4dd1-b203-39e31016ea09 dhcp configuration#033[00m Feb 20 04:54:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.089 264355 INFO neutron.agent.dhcp.agent [-] Starting network ce344b2b-0116-41ad-9d2e-ee513171891f dhcp configuration#033[00m Feb 20 04:54:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.090 264355 INFO neutron.agent.dhcp.agent [-] Finished network ce344b2b-0116-41ad-9d2e-ee513171891f dhcp configuration#033[00m Feb 20 04:54:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.091 264355 INFO neutron.agent.dhcp.agent [None req-1a2e3763-4058-4706-bf05-99d8b392f202 - - - - - -] Synchronizing state complete#033[00m Feb 20 04:54:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.133 264355 INFO neutron.agent.dhcp.agent [None req-4e0c3fb0-1208-45a7-b513-5624e7f180c6 - - - - - -] DHCP configuration for ports {'c5fe307a-fee2-41f6-aab2-23f1423edaff'} is completed#033[00m Feb 20 04:54:16 localhost systemd[1]: var-lib-containers-storage-overlay-d89d075bd4bd3762074982c4b1f354c4380264ee115e953111bb781925418ea1-merged.mount: Deactivated successfully. Feb 20 04:54:16 localhost systemd[1]: run-netns-qdhcp\x2dce344b2b\x2d0116\x2d41ad\x2d9d2e\x2dee513171891f.mount: Deactivated successfully. 
Feb 20 04:54:16 localhost ovn_controller[156798]: 2026-02-20T09:54:16Z|00187|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:16 localhost nova_compute[281288]: 2026-02-20 09:54:16.248 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:16 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:16.283 2 INFO neutron.agent.securitygroups_rpc [None req-00b37041-3488-4088-b5e4-b424dd1f63aa 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['56cc5e29-9f6f-4f35-9ade-42d618bdd35b']#033[00m Feb 20 04:54:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.696 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:17 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:17.514 2 INFO neutron.agent.securitygroups_rpc [None req-bee57eed-2319-4f09-8acf-ebe401de5df5 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['39c9ea95-070c-4bc4-9287-e329c91de991']#033[00m Feb 20 04:54:17 localhost podman[241968]: time="2026-02-20T09:54:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:54:17 localhost podman[241968]: @ - - [20/Feb/2026:09:54:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 20 04:54:17 localhost podman[241968]: @ - - [20/Feb/2026:09:54:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18808 "" "Go-http-client/1.1" Feb 20 
04:54:18 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:18.291 2 INFO neutron.agent.securitygroups_rpc [None req-63f4c14d-d8d8-4887-9902-5c1bd910d46b 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['39c9ea95-070c-4bc4-9287-e329c91de991']#033[00m Feb 20 04:54:19 localhost nova_compute[281288]: 2026-02-20 09:54:19.102 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:19.287 2 INFO neutron.agent.securitygroups_rpc [None req-0eb3d32b-3601-4bfa-a4ec-07263fd65bfe 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']#033[00m Feb 20 04:54:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:19.697 2 INFO neutron.agent.securitygroups_rpc [None req-c48149ca-5e2e-4ce1-9578-04651a281b9c 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']#033[00m Feb 20 04:54:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:20.116 2 INFO neutron.agent.securitygroups_rpc [None req-db7ac1b3-e635-4b44-b582-fc808315e3e2 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']#033[00m Feb 20 04:54:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:20.437 2 INFO neutron.agent.securitygroups_rpc [None req-59de0fbc-5d34-4b72-b4ef-ac2a7c9b1e9d 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']#033[00m Feb 20 04:54:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:20.728 2 INFO neutron.agent.securitygroups_rpc [None 
req-cd9b335d-ac52-4aea-badd-19c0c11ff6a7 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']#033[00m Feb 20 04:54:21 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:21.033 2 INFO neutron.agent.securitygroups_rpc [None req-eaa2444c-9904-4bf6-912c-ce544aec0944 a7bf5bd2a08b40838819ba2189f49015 8affd803090342c7a0b4a8c10fbcda95 - - default default] Security group member updated ['863be2ee-f2c8-49ce-9c0f-4d58bca5441c']#033[00m Feb 20 04:54:21 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:21.193 2 INFO neutron.agent.securitygroups_rpc [None req-eef0b483-52c5-455d-8e50-fdf7323c6cd9 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']#033[00m Feb 20 04:54:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:54:22 localhost systemd[1]: tmp-crun.vtyI4Y.mount: Deactivated successfully. 
Feb 20 04:54:22 localhost podman[311999]: 2026-02-20 09:54:22.14051521 +0000 UTC m=+0.082671220 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:54:22 localhost podman[311999]: 2026-02-20 09:54:22.150116621 +0000 UTC m=+0.092272681 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:54:22 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:54:22 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:22.463 2 INFO neutron.agent.securitygroups_rpc [None req-5a5ae7af-13a6-42c5-a0fc-7ba1957a4294 b9d64681c327441a81dfa771b4b413f6 ce97c44a73f94ada962654654798a4af - - default default] Security group member updated ['203b95e6-8f62-4037-821a-d64a45daeaf8']#033[00m Feb 20 04:54:22 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:22.961 2 INFO neutron.agent.securitygroups_rpc [None req-ec64d548-962f-41dd-a02f-4a25f3310f4f a7bf5bd2a08b40838819ba2189f49015 8affd803090342c7a0b4a8c10fbcda95 - - default default] Security group member updated ['863be2ee-f2c8-49ce-9c0f-4d58bca5441c']#033[00m Feb 20 04:54:23 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:23.156 2 INFO neutron.agent.securitygroups_rpc [None req-a78e376a-5faf-4597-9e34-68a60251f328 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['11123030-cb07-4b38-85fd-08bf79b16579']#033[00m Feb 20 04:54:24 localhost nova_compute[281288]: 2026-02-20 09:54:24.107 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:54:24 localhost nova_compute[281288]: 2026-02-20 09:54:24.108 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:54:24 localhost nova_compute[281288]: 2026-02-20 09:54:24.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:54:24 localhost nova_compute[281288]: 
2026-02-20 09:54:24.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:54:24 localhost nova_compute[281288]: 2026-02-20 09:54:24.127 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:24 localhost nova_compute[281288]: 2026-02-20 09:54:24.127 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:54:24 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:24.356 2 INFO neutron.agent.securitygroups_rpc [None req-f54c4838-5ce8-45ff-b090-12b4bbbb882f a7bf5bd2a08b40838819ba2189f49015 8affd803090342c7a0b4a8c10fbcda95 - - default default] Security group member updated ['863be2ee-f2c8-49ce-9c0f-4d58bca5441c']#033[00m Feb 20 04:54:24 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:24.569 2 INFO neutron.agent.securitygroups_rpc [None req-5c523cd2-f346-4abd-990f-316e1d877f9a b9d64681c327441a81dfa771b4b413f6 ce97c44a73f94ada962654654798a4af - - default default] Security group member updated ['203b95e6-8f62-4037-821a-d64a45daeaf8']#033[00m Feb 20 04:54:24 localhost dnsmasq[311368]: exiting on receipt of SIGTERM Feb 20 04:54:24 localhost systemd[1]: tmp-crun.zm6qgB.mount: Deactivated successfully. 
Feb 20 04:54:24 localhost podman[312038]: 2026-02-20 09:54:24.899510434 +0000 UTC m=+0.057893537 container kill e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:54:24 localhost systemd[1]: libpod-e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0.scope: Deactivated successfully. Feb 20 04:54:24 localhost ovn_controller[156798]: 2026-02-20T09:54:24Z|00188|binding|INFO|Removing iface tap5a613b4f-1d ovn-installed in OVS Feb 20 04:54:24 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:24.906 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port f8795842-3b76-4096-9eb1-555fb2e17873 with type ""#033[00m Feb 20 04:54:24 localhost ovn_controller[156798]: 2026-02-20T09:54:24Z|00189|binding|INFO|Removing lport 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 ovn-installed in OVS Feb 20 04:54:24 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:24.908 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4d3d4b22-89ae-4b72-8269-db16bc023693', 
'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d3d4b22-89ae-4b72-8269-db16bc023693', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c31c2cb7-7585-4255-8195-898589cc1c5d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:24 localhost nova_compute[281288]: 2026-02-20 09:54:24.909 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:24 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:24.910 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 in datapath 4d3d4b22-89ae-4b72-8269-db16bc023693 unbound from our chassis#033[00m Feb 20 04:54:24 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:24.911 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4d3d4b22-89ae-4b72-8269-db16bc023693 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:54:24 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:24.912 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e62f88-0a85-4ea3-b308-33a6e69228b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:24 localhost nova_compute[281288]: 2026-02-20 09:54:24.916 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:24 localhost podman[312051]: 2026-02-20 09:54:24.972459957 +0000 UTC m=+0.060730143 container died e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:54:25 localhost podman[312051]: 2026-02-20 09:54:25.003247971 +0000 UTC m=+0.091518107 container cleanup e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 20 04:54:25 localhost systemd[1]: libpod-conmon-e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0.scope: Deactivated successfully. 
Feb 20 04:54:25 localhost podman[312059]: 2026-02-20 09:54:25.057185268 +0000 UTC m=+0.130084538 container remove e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:54:25 localhost kernel: device tap5a613b4f-1d left promiscuous mode Feb 20 04:54:25 localhost nova_compute[281288]: 2026-02-20 09:54:25.069 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:25 localhost nova_compute[281288]: 2026-02-20 09:54:25.084 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:25 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:25.119 264355 INFO neutron.agent.dhcp.agent [None req-0316d5ad-e94a-4888-8259-441d1361f8b8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:25 localhost sshd[312084]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:54:25 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:25.301 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:25 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:25.529 2 INFO neutron.agent.securitygroups_rpc [None req-1b48fcb2-1507-433d-8e69-fc9c0e8a60aa a7bf5bd2a08b40838819ba2189f49015 8affd803090342c7a0b4a8c10fbcda95 - - default default] Security group member updated 
['863be2ee-f2c8-49ce-9c0f-4d58bca5441c']#033[00m Feb 20 04:54:25 localhost ovn_controller[156798]: 2026-02-20T09:54:25Z|00190|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:25 localhost nova_compute[281288]: 2026-02-20 09:54:25.747 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:54:25 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e124 e124: 6 total, 6 up, 6 in Feb 20 04:54:25 localhost systemd[1]: var-lib-containers-storage-overlay-f09de51b9fc12b68b5d61385aaa744ed437b92490600eef9db4abfcb8fb69f0d-merged.mount: Deactivated successfully. Feb 20 04:54:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0-userdata-shm.mount: Deactivated successfully. Feb 20 04:54:25 localhost systemd[1]: run-netns-qdhcp\x2d4d3d4b22\x2d89ae\x2d4b72\x2d8269\x2ddb16bc023693.mount: Deactivated successfully. 
Feb 20 04:54:25 localhost podman[312086]: 2026-02-20 09:54:25.90341975 +0000 UTC m=+0.086939658 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:54:25 localhost podman[312086]: 2026-02-20 09:54:25.941401422 +0000 UTC m=+0.124921350 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:54:25 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:54:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:26 localhost openstack_network_exporter[244414]: ERROR 09:54:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:54:26 localhost openstack_network_exporter[244414]: Feb 20 04:54:26 localhost openstack_network_exporter[244414]: ERROR 09:54:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:54:26 localhost openstack_network_exporter[244414]: Feb 20 04:54:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e125 e125: 6 total, 6 up, 6 in Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0. Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.687077) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28 Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267687211, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1164, "num_deletes": 254, "total_data_size": 1413785, "memory_usage": 1439328, "flush_reason": "Manual Compaction"} Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267697112, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 923135, "file_checksum": "", 
"file_checksum_func_name": "Unknown", "smallest_seqno": 18936, "largest_seqno": 20095, "table_properties": {"data_size": 918254, "index_size": 2416, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11360, "raw_average_key_size": 20, "raw_value_size": 908237, "raw_average_value_size": 1666, "num_data_blocks": 106, "num_entries": 545, "num_filter_entries": 545, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581202, "oldest_key_time": 1771581202, "file_creation_time": 1771581267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 10083 microseconds, and 5136 cpu microseconds. Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.697174) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 923135 bytes OK
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.697209) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.699147) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.699172) EVENT_LOG_v1 {"time_micros": 1771581267699166, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.699204) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1408047, prev total WAL file size 1408047, number of live WAL files 2.
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.700133) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(901KB)], [27(17MB)]
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267700199, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18824993, "oldest_snapshot_seqno": -1}
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 12202 keys, 15913972 bytes, temperature: kUnknown
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267774019, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15913972, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15845959, "index_size": 36424, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30533, "raw_key_size": 328079, "raw_average_key_size": 26, "raw_value_size": 15639612, "raw_average_value_size": 1281, "num_data_blocks": 1377, "num_entries": 12202, "num_filter_entries": 12202, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.774332) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15913972 bytes
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.776060) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 254.6 rd, 215.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 17.1 +0.0 blob) out(15.2 +0.0 blob), read-write-amplify(37.6) write-amplify(17.2) OK, records in: 12729, records dropped: 527 output_compression: NoCompression
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.776088) EVENT_LOG_v1 {"time_micros": 1771581267776075, "job": 14, "event": "compaction_finished", "compaction_time_micros": 73936, "compaction_time_cpu_micros": 42823, "output_level": 6, "num_output_files": 1, "total_output_size": 15913972, "num_input_records": 12729, "num_output_records": 12202, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267776348, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267778837, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.700021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:54:27 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 04:54:27 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e126 e126: 6 total, 6 up, 6 in
Feb 20 04:54:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 04:54:28 localhost podman[312109]: 2026-02-20 09:54:28.161747585 +0000 UTC m=+0.092809846 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Feb 20 04:54:28 localhost podman[312109]: 2026-02-20 09:54:28.179982728 +0000 UTC m=+0.111044999 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9)
Feb 20 04:54:28 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 04:54:28 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:28.781 2 INFO neutron.agent.securitygroups_rpc [None req-3a0ac74b-e97f-4840-a2be-cf3db56b29ba eed45d0e6e9a4013a0e822ffa85bb5cb 13f7a9ed49974d1596cd7746bdf2e7c4 - - default default] Security group rule updated ['92258b95-63d5-4c8a-9734-555bdc627d97']#033[00m
Feb 20 04:54:28 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e127 e127: 6 total, 6 up, 6 in
Feb 20 04:54:29 localhost nova_compute[281288]: 2026-02-20 09:54:29.163 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:29 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e128 e128: 6 total, 6 up, 6 in
Feb 20 04:54:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e129 e129: 6 total, 6 up, 6 in
Feb 20 04:54:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:54:31 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:31.622 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 20 04:54:32 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e130 e130: 6 total, 6 up, 6 in
Feb 20 04:54:33 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e131 e131: 6 total, 6 up, 6 in
Feb 20 04:54:34 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:34.083 264355 INFO neutron.agent.linux.ip_lib [None req-09b59930-cf6c-44f0-b792-6736fd28ff7b - - - - - -] Device tapf0ad1ac2-f9 cannot be used as it has no MAC address#033[00m
Feb 20 04:54:34 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e132 e132: 6 total, 6 up, 6 in
Feb 20 04:54:34 localhost nova_compute[281288]: 2026-02-20 09:54:34.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:34 localhost kernel: device tapf0ad1ac2-f9 entered promiscuous mode
Feb 20 04:54:34 localhost NetworkManager[5988]: [1771581274.1500] manager: (tapf0ad1ac2-f9): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Feb 20 04:54:34 localhost nova_compute[281288]: 2026-02-20 09:54:34.153 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:34 localhost systemd-udevd[312138]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 04:54:34 localhost ovn_controller[156798]: 2026-02-20T09:54:34Z|00191|binding|INFO|Claiming lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa for this chassis.
Feb 20 04:54:34 localhost ovn_controller[156798]: 2026-02-20T09:54:34Z|00192|binding|INFO|f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa: Claiming unknown
Feb 20 04:54:34 localhost nova_compute[281288]: 2026-02-20 09:54:34.167 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:34 localhost ovn_controller[156798]: 2026-02-20T09:54:34Z|00193|binding|INFO|Setting lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa ovn-installed in OVS
Feb 20 04:54:34 localhost nova_compute[281288]: 2026-02-20 09:54:34.192 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:34 localhost nova_compute[281288]: 2026-02-20 09:54:34.198 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:34 localhost ovn_controller[156798]: 2026-02-20T09:54:34Z|00194|binding|INFO|Setting lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa up in Southbound
Feb 20 04:54:34 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:34.210 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-d6b1eef5-3137-454f-8164-8278293c350a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6b1eef5-3137-454f-8164-8278293c350a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=189489f9-7d29-4419-9e31-fd1fec55fc46, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 20 04:54:34 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:34.213 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa in datapath d6b1eef5-3137-454f-8164-8278293c350a bound to our chassis#033[00m
Feb 20 04:54:34 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:34.215 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d6b1eef5-3137-454f-8164-8278293c350a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 20 04:54:34 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:34.216 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[03d05a97-e110-4289-88ad-b7a02bb85872]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:54:34 localhost nova_compute[281288]: 2026-02-20 09:54:34.244 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:34 localhost nova_compute[281288]: 2026-02-20 09:54:34.274 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 04:54:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 04:54:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e133 e133: 6 total, 6 up, 6 in
Feb 20 04:54:35 localhost podman[312186]: 2026-02-20 09:54:35.182209698 +0000 UTC m=+0.115101973 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:54:35 localhost podman[312186]: 2026-02-20 09:54:35.272967481 +0000 UTC m=+0.205859706 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller)
Feb 20 04:54:35 localhost podman[312213]:
Feb 20 04:54:35 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 04:54:35 localhost podman[312213]: 2026-02-20 09:54:35.283669555 +0000 UTC m=+0.129385626 container create 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:54:35 localhost podman[312187]: 2026-02-20 09:54:35.237153265 +0000 UTC m=+0.165152962 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 20 04:54:35 localhost podman[312187]: 2026-02-20 09:54:35.323072091 +0000 UTC m=+0.251071808 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 04:54:35 localhost podman[312213]: 2026-02-20 09:54:35.236181315 +0000 UTC m=+0.081897386 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 04:54:35 localhost systemd[1]: Started libpod-conmon-1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c.scope.
Feb 20 04:54:35 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 04:54:35 localhost systemd[1]: Started libcrun container.
Feb 20 04:54:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58eee122d4e0dd04c061f43cc5e2ecc11b5cf7d4c3051551afe09a5a3c5f2a8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 04:54:35 localhost podman[312213]: 2026-02-20 09:54:35.375693548 +0000 UTC m=+0.221409599 container init 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 04:54:35 localhost podman[312213]: 2026-02-20 09:54:35.38469155 +0000 UTC m=+0.230407601 container start 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 04:54:35 localhost dnsmasq[312254]: started, version 2.85 cachesize 150
Feb 20 04:54:35 localhost dnsmasq[312254]: DNS service limited to local subnets
Feb 20 04:54:35 localhost dnsmasq[312254]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 04:54:35 localhost dnsmasq[312254]: warning: no upstream servers configured
Feb 20 04:54:35 localhost dnsmasq-dhcp[312254]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 04:54:35 localhost dnsmasq[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/addn_hosts - 0 addresses
Feb 20 04:54:35 localhost dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/host
Feb 20 04:54:35 localhost dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/opts
Feb 20 04:54:35 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:35.593 264355 INFO neutron.agent.dhcp.agent [None req-ec2a78f5-7302-40f5-82f5-46c38fc7596e - - - - - -] DHCP configuration for ports {'bb02ac54-616f-471c-81a6-165ac4648db9'} is completed#033[00m
Feb 20 04:54:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e134 e134: 6 total, 6 up, 6 in
Feb 20 04:54:36 localhost systemd[1]: tmp-crun.BhWUNY.mount: Deactivated successfully.
Feb 20 04:54:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:54:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e135 e135: 6 total, 6 up, 6 in
Feb 20 04:54:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:37.638 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 20 04:54:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e136 e136: 6 total, 6 up, 6 in
Feb 20 04:54:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:37.683 264355 INFO neutron.agent.linux.ip_lib [None req-0019394c-92b8-4fed-9379-791875aaf1b9 - - - - - -] Device tap626e3303-a1 cannot be used as it has no MAC address#033[00m
Feb 20 04:54:37 localhost nova_compute[281288]: 2026-02-20 09:54:37.751 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:37 localhost kernel: device tap626e3303-a1 entered promiscuous mode
Feb 20 04:54:37 localhost NetworkManager[5988]: [1771581277.7623] manager: (tap626e3303-a1): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Feb 20 04:54:37 localhost ovn_controller[156798]: 2026-02-20T09:54:37Z|00195|binding|INFO|Claiming lport 626e3303-a120-4467-a686-694c8014af7a for this chassis.
Feb 20 04:54:37 localhost ovn_controller[156798]: 2026-02-20T09:54:37Z|00196|binding|INFO|626e3303-a120-4467-a686-694c8014af7a: Claiming unknown
Feb 20 04:54:37 localhost systemd-udevd[312301]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 04:54:37 localhost nova_compute[281288]: 2026-02-20 09:54:37.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:37 localhost ovn_controller[156798]: 2026-02-20T09:54:37Z|00197|binding|INFO|Setting lport 626e3303-a120-4467-a686-694c8014af7a ovn-installed in OVS
Feb 20 04:54:37 localhost ovn_controller[156798]: 2026-02-20T09:54:37Z|00198|binding|INFO|Setting lport 626e3303-a120-4467-a686-694c8014af7a up in Southbound
Feb 20 04:54:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:37.778 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2f1c353d-8de5-4616-8b20-8c686b261d9f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f1c353d-8de5-4616-8b20-8c686b261d9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d454f9be-83f3-4c60-8509-6d14a02fd7f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=626e3303-a120-4467-a686-694c8014af7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 20 04:54:37 localhost nova_compute[281288]: 2026-02-20 09:54:37.780 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:37 localhost nova_compute[281288]: 2026-02-20 09:54:37.781 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:54:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:37.782 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 626e3303-a120-4467-a686-694c8014af7a in datapath 2f1c353d-8de5-4616-8b20-8c686b261d9f bound to our chassis#033[00m
Feb 20 04:54:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:37.785 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9fb30275-dfb9-4635-9229-a9a2263cae49 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 20 04:54:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:37.786 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f1c353d-8de5-4616-8b20-8c686b261d9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 20 04:54:37
localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:37.790 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[8b584219-7a80-432b-aae2-b4b948bff54f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:37 localhost nova_compute[281288]: 2026-02-20 09:54:37.807 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:37 localhost nova_compute[281288]: 2026-02-20 09:54:37.858 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:37 localhost nova_compute[281288]: 2026-02-20 09:54:37.889 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost sshd[312330]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:54:38 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:54:38 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:54:38 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:54:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:38.564 264355 INFO neutron.agent.linux.ip_lib [None req-5286dd56-94be-439a-8d96-c43df7ef8ddc - - - - - -] Device tap090441cf-8d cannot be used as it has no MAC address#033[00m Feb 20 04:54:38 localhost nova_compute[281288]: 2026-02-20 09:54:38.586 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost kernel: device tap090441cf-8d entered promiscuous mode Feb 20 04:54:38 localhost NetworkManager[5988]: [1771581278.5943] manager: (tap090441cf-8d): new Generic device 
(/org/freedesktop/NetworkManager/Devices/34) Feb 20 04:54:38 localhost nova_compute[281288]: 2026-02-20 09:54:38.596 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost ovn_controller[156798]: 2026-02-20T09:54:38Z|00199|binding|INFO|Claiming lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 for this chassis. Feb 20 04:54:38 localhost ovn_controller[156798]: 2026-02-20T09:54:38Z|00200|binding|INFO|090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7: Claiming unknown Feb 20 04:54:38 localhost nova_compute[281288]: 2026-02-20 09:54:38.609 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:38.615 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-5bd7ee4d-003e-46a4-8831-dc1b39078c68', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bd7ee4d-003e-46a4-8831-dc1b39078c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a64fc1d2-8a52-4707-811c-fcf1e047c8d6, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:38 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:38.617 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 in datapath 5bd7ee4d-003e-46a4-8831-dc1b39078c68 bound to our chassis#033[00m Feb 20 04:54:38 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:38.619 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bd7ee4d-003e-46a4-8831-dc1b39078c68 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:54:38 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:38.620 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[435c0c98-7843-406c-969c-a4e9032df93b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:38 localhost ovn_controller[156798]: 2026-02-20T09:54:38Z|00201|binding|INFO|Setting lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 ovn-installed in OVS Feb 20 04:54:38 localhost ovn_controller[156798]: 2026-02-20T09:54:38Z|00202|binding|INFO|Setting lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 up in Southbound Feb 20 04:54:38 localhost nova_compute[281288]: 2026-02-20 09:54:38.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost nova_compute[281288]: 2026-02-20 09:54:38.641 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost nova_compute[281288]: 2026-02-20 09:54:38.643 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost nova_compute[281288]: 2026-02-20 09:54:38.701 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e137 e137: 6 total, 6 up, 6 in Feb 20 04:54:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:38.729 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:38Z, description=, device_id=53f382fa-9b2a-4c2c-b53d-7f58d82124d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bc970bd1-c7c2-4901-b5ca-da9674328188, ip_allocation=immediate, mac_address=fa:16:3e:4d:17:27, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:30Z, description=, dns_domain=, id=d6b1eef5-3137-454f-8164-8278293c350a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-959522799, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20486, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1744, status=ACTIVE, subnets=['35b82aef-e2db-412e-8a11-c85033ae6cf6'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:33Z, vlan_transparent=None, network_id=d6b1eef5-3137-454f-8164-8278293c350a, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1823, 
status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:38Z on network d6b1eef5-3137-454f-8164-8278293c350a#033[00m Feb 20 04:54:38 localhost nova_compute[281288]: 2026-02-20 09:54:38.739 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:38 localhost podman[312446]: Feb 20 04:54:38 localhost sshd[312482]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:54:38 localhost podman[312446]: 2026-02-20 09:54:38.923120723 +0000 UTC m=+0.151346503 container create 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:54:38 localhost podman[312446]: 2026-02-20 09:54:38.82775297 +0000 UTC m=+0.055978760 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:54:38 localhost systemd[1]: Started libpod-conmon-5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86.scope. 
Feb 20 04:54:38 localhost dnsmasq[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/addn_hosts - 1 addresses Feb 20 04:54:38 localhost dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/host Feb 20 04:54:38 localhost dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/opts Feb 20 04:54:38 localhost podman[312484]: 2026-02-20 09:54:38.979261896 +0000 UTC m=+0.070264133 container kill 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:54:39 localhost systemd[1]: Started libcrun container. 
Feb 20 04:54:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58f04d840f8baaa0f71bc8773e36a918813b860415af52dc0d268b64c80757aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:54:39 localhost podman[312446]: 2026-02-20 09:54:39.013924608 +0000 UTC m=+0.242150378 container init 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:54:39 localhost podman[312446]: 2026-02-20 09:54:39.025703696 +0000 UTC m=+0.253929446 container start 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:54:39 localhost dnsmasq[312526]: started, version 2.85 cachesize 150 Feb 20 04:54:39 localhost dnsmasq[312526]: DNS service limited to local subnets Feb 20 04:54:39 localhost dnsmasq[312526]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:54:39 localhost dnsmasq[312526]: warning: no upstream servers 
configured Feb 20 04:54:39 localhost dnsmasq-dhcp[312526]: DHCP, static leases only on 10.101.0.0, lease time 1d Feb 20 04:54:39 localhost dnsmasq[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/addn_hosts - 0 addresses Feb 20 04:54:39 localhost dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/host Feb 20 04:54:39 localhost dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/opts Feb 20 04:54:39 localhost nova_compute[281288]: 2026-02-20 09:54:39.204 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:54:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:54:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:54:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:54:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:54:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:39.211 264355 INFO neutron.agent.dhcp.agent [None req-3e8063b1-0647-422d-99c5-413d5e36a215 - - - - - -] DHCP configuration for ports {'ba50db7a-b774-4bda-9089-f43c5b7188aa'} is completed#033[00m Feb 20 04:54:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:39.464 264355 INFO neutron.agent.dhcp.agent [None req-a611ad1f-9bb6-4df4-aa6f-29453444ecfc - - - - - -] DHCP configuration for ports {'bc970bd1-c7c2-4901-b5ca-da9674328188'} is completed#033[00m Feb 20 04:54:39 localhost podman[312585]: Feb 20 04:54:39 localhost podman[312585]: 2026-02-20 09:54:39.66758687 
+0000 UTC m=+0.092525739 container create f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3) Feb 20 04:54:39 localhost systemd[1]: Started libpod-conmon-f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e.scope. Feb 20 04:54:39 localhost podman[312585]: 2026-02-20 09:54:39.624704029 +0000 UTC m=+0.049642898 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:54:39 localhost systemd[1]: Started libcrun container. Feb 20 04:54:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/827134bce1983498aa5eb328de0e6b4c29e8835a5007ea6118894d8813f2ece8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:54:39 localhost podman[312585]: 2026-02-20 09:54:39.741907565 +0000 UTC m=+0.166846394 container init f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:54:39 localhost podman[312585]: 2026-02-20 09:54:39.751428623 +0000 UTC m=+0.176367442 container start 
f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 20 04:54:39 localhost dnsmasq[312603]: started, version 2.85 cachesize 150 Feb 20 04:54:39 localhost dnsmasq[312603]: DNS service limited to local subnets Feb 20 04:54:39 localhost dnsmasq[312603]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:54:39 localhost dnsmasq[312603]: warning: no upstream servers configured Feb 20 04:54:39 localhost dnsmasq-dhcp[312603]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:54:39 localhost dnsmasq[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/addn_hosts - 0 addresses Feb 20 04:54:39 localhost dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/host Feb 20 04:54:39 localhost dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/opts Feb 20 04:54:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:39.921 264355 INFO neutron.agent.dhcp.agent [None req-09be6d0d-2407-4c31-a111-375f4e7015bf - - - - - -] DHCP configuration for ports {'99bc2d50-f443-4862-87f7-b71e29668ed0'} is completed#033[00m Feb 20 04:54:40 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:40.155 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, 
binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:39Z, description=, device_id=112ff0d9-32cb-4021-a1c4-20c4dbfa4300, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d763ba6c-a99a-441a-a009-8982c2eb3ef5, ip_allocation=immediate, mac_address=fa:16:3e:33:11:a0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:35Z, description=, dns_domain=, id=5bd7ee4d-003e-46a4-8831-dc1b39078c68, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1153125537, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1798, status=ACTIVE, subnets=['3c327061-4348-4983-bd3a-abe5ea80ff0e'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:37Z, vlan_transparent=None, network_id=5bd7ee4d-003e-46a4-8831-dc1b39078c68, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1832, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:39Z on network 5bd7ee4d-003e-46a4-8831-dc1b39078c68#033[00m Feb 20 04:54:40 localhost dnsmasq[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/addn_hosts - 1 addresses Feb 20 04:54:40 localhost dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/host Feb 20 04:54:40 localhost dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/opts Feb 20 04:54:40 localhost podman[312620]: 2026-02-20 09:54:40.355701066 +0000 UTC m=+0.067193229 container kill 
f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:54:40 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:40.576 264355 INFO neutron.agent.dhcp.agent [None req-0a4937a7-4a0d-44d6-9a86-965f292ccade - - - - - -] DHCP configuration for ports {'d763ba6c-a99a-441a-a009-8982c2eb3ef5'} is completed#033[00m Feb 20 04:54:40 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:40.743 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:38Z, description=, device_id=53f382fa-9b2a-4c2c-b53d-7f58d82124d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bc970bd1-c7c2-4901-b5ca-da9674328188, ip_allocation=immediate, mac_address=fa:16:3e:4d:17:27, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:30Z, description=, dns_domain=, id=d6b1eef5-3137-454f-8164-8278293c350a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-959522799, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20486, qos_policy_id=None, revision_number=2, router:external=False, shared=False, 
standard_attr_id=1744, status=ACTIVE, subnets=['35b82aef-e2db-412e-8a11-c85033ae6cf6'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:33Z, vlan_transparent=None, network_id=d6b1eef5-3137-454f-8164-8278293c350a, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1823, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:38Z on network d6b1eef5-3137-454f-8164-8278293c350a#033[00m Feb 20 04:54:40 localhost podman[312657]: 2026-02-20 09:54:40.994495447 +0000 UTC m=+0.063127057 container kill 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:54:40 localhost dnsmasq[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/addn_hosts - 1 addresses Feb 20 04:54:40 localhost dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/host Feb 20 04:54:40 localhost dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/opts Feb 20 04:54:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:41.261 264355 INFO neutron.agent.dhcp.agent [None req-086cf761-f22a-46ed-9f9f-3a174ca6583d - - - - - -] DHCP configuration for ports {'bc970bd1-c7c2-4901-b5ca-da9674328188'} is completed#033[00m Feb 20 04:54:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e137 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:41.499 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:39Z, description=, device_id=112ff0d9-32cb-4021-a1c4-20c4dbfa4300, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d763ba6c-a99a-441a-a009-8982c2eb3ef5, ip_allocation=immediate, mac_address=fa:16:3e:33:11:a0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:35Z, description=, dns_domain=, id=5bd7ee4d-003e-46a4-8831-dc1b39078c68, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1153125537, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1798, status=ACTIVE, subnets=['3c327061-4348-4983-bd3a-abe5ea80ff0e'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:37Z, vlan_transparent=None, network_id=5bd7ee4d-003e-46a4-8831-dc1b39078c68, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1832, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:39Z on network 5bd7ee4d-003e-46a4-8831-dc1b39078c68#033[00m Feb 20 04:54:41 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:41.593 2 INFO neutron.agent.securitygroups_rpc [None 
req-eb743b9b-8319-4b6c-9522-759dec99d8f5 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:54:41 localhost dnsmasq[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/addn_hosts - 1 addresses Feb 20 04:54:41 localhost dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/host Feb 20 04:54:41 localhost podman[312695]: 2026-02-20 09:54:41.697414722 +0000 UTC m=+0.065968983 container kill f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:54:41 localhost dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/opts Feb 20 04:54:41 localhost systemd[1]: tmp-crun.rEv27L.mount: Deactivated successfully. 
Feb 20 04:54:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:41.919 264355 INFO neutron.agent.dhcp.agent [None req-134f6b14-a764-4e56-9a96-d6340411d015 - - - - - -] DHCP configuration for ports {'d763ba6c-a99a-441a-a009-8982c2eb3ef5'} is completed#033[00m Feb 20 04:54:42 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:42.259 2 INFO neutron.agent.securitygroups_rpc [None req-b2b645bc-d759-4ba7-b30a-1010fa24d49e 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:54:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:42.259 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:41Z, description=, device_id=53f382fa-9b2a-4c2c-b53d-7f58d82124d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=931018ff-773f-42ec-b764-15a4ad18e505, ip_allocation=immediate, mac_address=fa:16:3e:7e:b4:55, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:31Z, description=, dns_domain=, id=2f1c353d-8de5-4616-8b20-8c686b261d9f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-571545747, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38343, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1765, status=ACTIVE, subnets=['2b7f7d2f-8d57-4c58-a9a5-42e3b231614b'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:35Z, vlan_transparent=None, 
network_id=2f1c353d-8de5-4616-8b20-8c686b261d9f, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1844, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:42Z on network 2f1c353d-8de5-4616-8b20-8c686b261d9f#033[00m Feb 20 04:54:42 localhost podman[312733]: 2026-02-20 09:54:42.466895818 +0000 UTC m=+0.057869747 container kill 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 20 04:54:42 localhost dnsmasq[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/addn_hosts - 1 addresses Feb 20 04:54:42 localhost dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/host Feb 20 04:54:42 localhost dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/opts Feb 20 04:54:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:54:42 localhost podman[312748]: 2026-02-20 09:54:42.579793903 +0000 UTC m=+0.085753313 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, 
container_name=ceilometer_agent_compute) Feb 20 04:54:42 localhost podman[312748]: 2026-02-20 09:54:42.615172416 +0000 UTC m=+0.121131846 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.vendor=CentOS) Feb 20 04:54:42 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:54:42 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e138 e138: 6 total, 6 up, 6 in Feb 20 04:54:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:42.761 264355 INFO neutron.agent.dhcp.agent [None req-d780b5c2-caee-45a1-9727-7cc1f4f29673 - - - - - -] DHCP configuration for ports {'931018ff-773f-42ec-b764-15a4ad18e505'} is completed#033[00m Feb 20 04:54:43 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:43.822 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:41Z, description=, device_id=53f382fa-9b2a-4c2c-b53d-7f58d82124d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=931018ff-773f-42ec-b764-15a4ad18e505, ip_allocation=immediate, mac_address=fa:16:3e:7e:b4:55, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:31Z, description=, dns_domain=, id=2f1c353d-8de5-4616-8b20-8c686b261d9f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-571545747, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38343, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1765, status=ACTIVE, subnets=['2b7f7d2f-8d57-4c58-a9a5-42e3b231614b'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:35Z, vlan_transparent=None, network_id=2f1c353d-8de5-4616-8b20-8c686b261d9f, port_security_enabled=False, 
project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1844, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:42Z on network 2f1c353d-8de5-4616-8b20-8c686b261d9f#033[00m Feb 20 04:54:44 localhost dnsmasq[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/addn_hosts - 1 addresses Feb 20 04:54:44 localhost dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/host Feb 20 04:54:44 localhost dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/opts Feb 20 04:54:44 localhost podman[312788]: 2026-02-20 09:54:44.061762995 +0000 UTC m=+0.069383676 container kill 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:54:44 localhost nova_compute[281288]: 2026-02-20 09:54:44.206 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:54:44 localhost nova_compute[281288]: 2026-02-20 09:54:44.208 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:54:44 localhost nova_compute[281288]: 2026-02-20 09:54:44.208 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:54:44 localhost nova_compute[281288]: 2026-02-20 09:54:44.209 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:54:44 localhost nova_compute[281288]: 2026-02-20 09:54:44.241 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:44 localhost nova_compute[281288]: 2026-02-20 09:54:44.241 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:54:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:44.304 264355 INFO neutron.agent.dhcp.agent [None req-95e5c5c7-e3a2-4786-bb3a-add350ccd39a - - - - - -] DHCP configuration for ports {'931018ff-773f-42ec-b764-15a4ad18e505'} is completed#033[00m Feb 20 04:54:44 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:54:45 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e139 e139: 6 total, 6 up, 6 in Feb 20 04:54:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:46 localhost dnsmasq[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/addn_hosts - 0 addresses Feb 20 04:54:46 localhost dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/host Feb 20 04:54:46 localhost dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/opts Feb 20 04:54:46 localhost podman[312826]: 2026-02-20 09:54:46.4255235 +0000 UTC m=+0.060058433 container kill 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true) Feb 20 04:54:46 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:46.612 264355 INFO neutron.agent.linux.ip_lib [None req-bd819d36-1606-418f-9829-aaeaae7eb8a0 - - - - - -] Device tap3f9955f7-a5 cannot be used as it has no MAC address#033[00m Feb 20 04:54:46 localhost nova_compute[281288]: 2026-02-20 09:54:46.637 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:46 localhost kernel: device tap3f9955f7-a5 entered promiscuous mode Feb 20 04:54:46 localhost ovn_controller[156798]: 2026-02-20T09:54:46Z|00203|binding|INFO|Claiming lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 for this chassis. Feb 20 04:54:46 localhost ovn_controller[156798]: 2026-02-20T09:54:46Z|00204|binding|INFO|3f9955f7-a558-438e-b36a-2dfdb0fa6f03: Claiming unknown Feb 20 04:54:46 localhost NetworkManager[5988]: [1771581286.6460] manager: (tap3f9955f7-a5): new Generic device (/org/freedesktop/NetworkManager/Devices/35) Feb 20 04:54:46 localhost nova_compute[281288]: 2026-02-20 09:54:46.645 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:46 localhost systemd-udevd[312858]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:54:46 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:46.657 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7365eb83b07c4401a5a58afb3f122ce5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5653766c-f9f7-4ea8-b60b-d59b52335179, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3f9955f7-a558-438e-b36a-2dfdb0fa6f03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:46 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:46.659 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 in datapath c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1 bound to our chassis#033[00m Feb 20 04:54:46 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:46.661 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:54:46 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:46.662 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c4de329f-6eb6-46bf-827e-1913efa5eb92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:46 localhost journal[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device Feb 20 04:54:46 localhost ovn_controller[156798]: 2026-02-20T09:54:46Z|00205|binding|INFO|Setting lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 ovn-installed in OVS Feb 20 04:54:46 localhost ovn_controller[156798]: 2026-02-20T09:54:46Z|00206|binding|INFO|Setting lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 up in Southbound Feb 20 04:54:46 localhost journal[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device Feb 20 04:54:46 localhost nova_compute[281288]: 2026-02-20 09:54:46.696 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:46 localhost journal[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device Feb 20 04:54:46 localhost journal[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device Feb 20 04:54:46 localhost journal[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device Feb 20 04:54:46 localhost journal[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device Feb 20 04:54:46 localhost journal[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device Feb 20 04:54:46 localhost journal[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device Feb 20 04:54:46 localhost nova_compute[281288]: 2026-02-20 09:54:46.735 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:46 localhost nova_compute[281288]: 2026-02-20 09:54:46.763 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:46 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:46.912 2 INFO neutron.agent.securitygroups_rpc [None req-bce28f21-3f23-462f-a383-948742167547 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:54:47 localhost ovn_controller[156798]: 2026-02-20T09:54:47Z|00207|binding|INFO|Releasing lport 626e3303-a120-4467-a686-694c8014af7a from this chassis (sb_readonly=0) Feb 20 04:54:47 localhost ovn_controller[156798]: 2026-02-20T09:54:47Z|00208|binding|INFO|Setting lport 626e3303-a120-4467-a686-694c8014af7a down in Southbound Feb 20 04:54:47 localhost nova_compute[281288]: 2026-02-20 09:54:47.067 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:47 localhost kernel: device tap626e3303-a1 left promiscuous mode Feb 20 04:54:47 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:47.075 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2f1c353d-8de5-4616-8b20-8c686b261d9f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f1c353d-8de5-4616-8b20-8c686b261d9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d454f9be-83f3-4c60-8509-6d14a02fd7f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=626e3303-a120-4467-a686-694c8014af7a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:47 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:47.077 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 626e3303-a120-4467-a686-694c8014af7a in datapath 2f1c353d-8de5-4616-8b20-8c686b261d9f unbound from our chassis#033[00m Feb 20 04:54:47 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:47.080 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f1c353d-8de5-4616-8b20-8c686b261d9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:54:47 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:47.081 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3bf18d-8d9b-4e0e-bc55-f74a528fb939]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:47 localhost nova_compute[281288]: 2026-02-20 09:54:47.090 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:47 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:47.554 2 INFO neutron.agent.securitygroups_rpc [None req-4cd6b0db-c578-42d5-a167-b93bcbfe0117 d89d4d1c90df43a18f75b47e31c78f62 7365eb83b07c4401a5a58afb3f122ce5 - - default default] Security group member updated ['921ea913-835d-427d-a3f2-d35699dcd043']#033[00m Feb 20 04:54:47 localhost podman[312930]: Feb 20 04:54:47 localhost 
podman[312930]: 2026-02-20 09:54:47.581615644 +0000 UTC m=+0.066932841 container create 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:54:47 localhost systemd[1]: Started libpod-conmon-087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6.scope. Feb 20 04:54:47 localhost systemd[1]: Started libcrun container. Feb 20 04:54:47 localhost podman[312930]: 2026-02-20 09:54:47.555880974 +0000 UTC m=+0.041198151 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:54:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49439000619e910f9d57f4aa05bb363f13b2b24373c80bde35cfff75a4113f51/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:54:47 localhost podman[312930]: 2026-02-20 09:54:47.667107419 +0000 UTC m=+0.152424626 container init 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:54:47 localhost podman[312930]: 2026-02-20 09:54:47.676792902 +0000 
UTC m=+0.162110099 container start 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:54:47 localhost dnsmasq[312949]: started, version 2.85 cachesize 150 Feb 20 04:54:47 localhost dnsmasq[312949]: DNS service limited to local subnets Feb 20 04:54:47 localhost dnsmasq[312949]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:54:47 localhost dnsmasq[312949]: warning: no upstream servers configured Feb 20 04:54:47 localhost dnsmasq-dhcp[312949]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:54:47 localhost dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 0 addresses Feb 20 04:54:47 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host Feb 20 04:54:47 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts Feb 20 04:54:47 localhost podman[241968]: time="2026-02-20T09:54:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:54:47 localhost podman[241968]: @ - - [20/Feb/2026:09:54:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162538 "" "Go-http-client/1.1" Feb 20 04:54:47 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:47.740 264355 INFO neutron.agent.dhcp.agent [None 
req-bd819d36-1606-418f-9829-aaeaae7eb8a0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:46Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=21b880c8-0671-49bf-908b-cd75a9b606ab, ip_allocation=immediate, mac_address=fa:16:3e:07:f9:a2, name=tempest-ExtraDHCPOptionsIpV6TestJSON-528365005, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:44Z, description=, dns_domain=, id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1165484479, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42841, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1866, status=ACTIVE, subnets=['271aec3b-f42d-4679-a514-3cc525446f17'], tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:45Z, vlan_transparent=None, network_id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['921ea913-835d-427d-a3f2-d35699dcd043'], standard_attr_id=1888, status=DOWN, tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:47Z on network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1#033[00m Feb 20 04:54:47 localhost podman[241968]: @ - - [20/Feb/2026:09:54:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20237 "" "Go-http-client/1.1" Feb 20 04:54:47 localhost 
neutron_dhcp_agent[264351]: 2026-02-20 09:54:47.895 264355 INFO neutron.agent.dhcp.agent [None req-2b239739-ba0e-4120-bee5-726985ba6e06 - - - - - -] DHCP configuration for ports {'728721aa-7c0e-449a-a140-7666ef1d7539'} is completed#033[00m Feb 20 04:54:47 localhost dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 1 addresses Feb 20 04:54:47 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host Feb 20 04:54:47 localhost podman[312970]: 2026-02-20 09:54:47.995572994 +0000 UTC m=+0.116290170 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:54:47 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts Feb 20 04:54:48 localhost dnsmasq[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/addn_hosts - 0 addresses Feb 20 04:54:48 localhost dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/host Feb 20 04:54:48 localhost podman[312994]: 2026-02-20 09:54:48.058719959 +0000 UTC m=+0.098445848 container kill f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 04:54:48 localhost dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/opts Feb 20 04:54:48 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:48.256 264355 INFO neutron.agent.dhcp.agent [None req-a6bdfe51-e495-4b8f-a2e2-1d39131de0e8 - - - - - -] DHCP configuration for ports {'21b880c8-0671-49bf-908b-cd75a9b606ab'} is completed#033[00m Feb 20 04:54:48 localhost ovn_controller[156798]: 2026-02-20T09:54:48Z|00209|binding|INFO|Releasing lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 from this chassis (sb_readonly=0) Feb 20 04:54:48 localhost kernel: device tap090441cf-8d left promiscuous mode Feb 20 04:54:48 localhost ovn_controller[156798]: 2026-02-20T09:54:48Z|00210|binding|INFO|Setting lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 down in Southbound Feb 20 04:54:48 localhost nova_compute[281288]: 2026-02-20 09:54:48.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:48 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:48.305 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-5bd7ee4d-003e-46a4-8831-dc1b39078c68', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bd7ee4d-003e-46a4-8831-dc1b39078c68', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a64fc1d2-8a52-4707-811c-fcf1e047c8d6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:48 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:48.307 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 in datapath 5bd7ee4d-003e-46a4-8831-dc1b39078c68 unbound from our chassis#033[00m Feb 20 04:54:48 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:48.309 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bd7ee4d-003e-46a4-8831-dc1b39078c68 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:54:48 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:48.310 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a9a5b5-53f4-4f3c-9f94-601e3833dc04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:48 localhost nova_compute[281288]: 2026-02-20 09:54:48.313 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:48 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:48.441 2 INFO neutron.agent.securitygroups_rpc [None req-8cc87b89-3410-419f-80d0-39ae3addedda 
f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:54:48 localhost dnsmasq[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/addn_hosts - 0 addresses Feb 20 04:54:48 localhost dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/host Feb 20 04:54:48 localhost podman[313043]: 2026-02-20 09:54:48.671236033 +0000 UTC m=+0.060619981 container kill 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:54:48 localhost dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/opts Feb 20 04:54:48 localhost ovn_controller[156798]: 2026-02-20T09:54:48Z|00211|binding|INFO|Releasing lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa from this chassis (sb_readonly=0) Feb 20 04:54:48 localhost ovn_controller[156798]: 2026-02-20T09:54:48Z|00212|binding|INFO|Setting lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa down in Southbound Feb 20 04:54:48 localhost kernel: device tapf0ad1ac2-f9 left promiscuous mode Feb 20 04:54:48 localhost nova_compute[281288]: 2026-02-20 09:54:48.986 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:49 localhost nova_compute[281288]: 2026-02-20 09:54:49.009 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:49.075 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-d6b1eef5-3137-454f-8164-8278293c350a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6b1eef5-3137-454f-8164-8278293c350a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=189489f9-7d29-4419-9e31-fd1fec55fc46, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:49.077 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa in datapath d6b1eef5-3137-454f-8164-8278293c350a unbound from our chassis#033[00m Feb 20 04:54:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:49.081 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
d6b1eef5-3137-454f-8164-8278293c350a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:54:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:49.082 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[05e91604-94f4-48b9-844e-9d095f79daff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:49 localhost nova_compute[281288]: 2026-02-20 09:54:49.242 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:49 localhost nova_compute[281288]: 2026-02-20 09:54:49.244 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:49 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:49.522 2 INFO neutron.agent.securitygroups_rpc [None req-fe10c928-22e1-439f-ab59-773382d09580 d89d4d1c90df43a18f75b47e31c78f62 7365eb83b07c4401a5a58afb3f122ce5 - - default default] Security group member updated ['921ea913-835d-427d-a3f2-d35699dcd043']#033[00m Feb 20 04:54:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:49.858 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:48Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=4d0355d6-dfb2-4565-94cc-eebabb872f93, ip_allocation=immediate, mac_address=fa:16:3e:eb:1d:72, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1482054331, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:44Z, description=, dns_domain=, id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, 
ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1165484479, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42841, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1866, status=ACTIVE, subnets=['271aec3b-f42d-4679-a514-3cc525446f17'], tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:45Z, vlan_transparent=None, network_id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['921ea913-835d-427d-a3f2-d35699dcd043'], standard_attr_id=1891, status=DOWN, tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:48Z on network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1#033[00m Feb 20 04:54:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:49.883 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Feb 20 04:54:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:49.883 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Feb 20 04:54:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:49.884 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Feb 20 04:54:50 localhost dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 2 addresses Feb 20 04:54:50 localhost podman[313082]: 2026-02-20 09:54:50.052701974 +0000 UTC m=+0.058815365 container kill 
087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:54:50 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host Feb 20 04:54:50 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts Feb 20 04:54:50 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:50.373 264355 INFO neutron.agent.dhcp.agent [None req-dcc37af5-f4a6-45fc-a6d2-2db6a24de490 - - - - - -] DHCP configuration for ports {'4d0355d6-dfb2-4565-94cc-eebabb872f93'} is completed#033[00m Feb 20 04:54:51 localhost podman[313119]: 2026-02-20 09:54:51.244286726 +0000 UTC m=+0.063482697 container kill f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:54:51 localhost dnsmasq[312603]: exiting on receipt of SIGTERM Feb 20 04:54:51 localhost systemd[1]: libpod-f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e.scope: Deactivated successfully. 
Feb 20 04:54:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:51 localhost podman[313134]: 2026-02-20 09:54:51.322911432 +0000 UTC m=+0.059752145 container died f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:54:51 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e-userdata-shm.mount: Deactivated successfully. Feb 20 04:54:51 localhost podman[313134]: 2026-02-20 09:54:51.355328915 +0000 UTC m=+0.092169588 container cleanup f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:54:51 localhost systemd[1]: libpod-conmon-f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e.scope: Deactivated successfully. 
Feb 20 04:54:51 localhost podman[313133]: 2026-02-20 09:54:51.394593846 +0000 UTC m=+0.128502970 container remove f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:54:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e140 e140: 6 total, 6 up, 6 in Feb 20 04:54:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:51.943 264355 INFO neutron.agent.dhcp.agent [None req-78dd64e3-d426-4875-b095-eae114cb9051 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:52.142 2 INFO neutron.agent.securitygroups_rpc [None req-ef251e71-dc68-4712-81f3-aa7e64c344fa d89d4d1c90df43a18f75b47e31c78f62 7365eb83b07c4401a5a58afb3f122ce5 - - default default] Security group member updated ['921ea913-835d-427d-a3f2-d35699dcd043']#033[00m Feb 20 04:54:52 localhost systemd[1]: var-lib-containers-storage-overlay-827134bce1983498aa5eb328de0e6b4c29e8835a5007ea6118894d8813f2ece8-merged.mount: Deactivated successfully. Feb 20 04:54:52 localhost systemd[1]: run-netns-qdhcp\x2d5bd7ee4d\x2d003e\x2d46a4\x2d8831\x2ddc1b39078c68.mount: Deactivated successfully. Feb 20 04:54:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:54:52 localhost podman[313160]: 2026-02-20 09:54:52.362366838 +0000 UTC m=+0.088181087 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:54:52 localhost podman[313160]: 2026-02-20 09:54:52.37033263 +0000 UTC m=+0.096146889 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:54:52 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:54:52 localhost dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 1 addresses Feb 20 04:54:52 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host Feb 20 04:54:52 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts Feb 20 04:54:52 localhost podman[313200]: 2026-02-20 09:54:52.518766063 +0000 UTC m=+0.062867989 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:54:52 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:52.925 2 INFO neutron.agent.securitygroups_rpc [None req-ffde2a61-a3bf-4d03-bb6c-67c1d37d15bf f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:54:53 localhost dnsmasq[312526]: exiting on receipt of SIGTERM Feb 20 04:54:53 localhost podman[313237]: 2026-02-20 09:54:53.024360442 +0000 UTC m=+0.046409620 container kill 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:54:53 localhost systemd[1]: libpod-5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86.scope: Deactivated successfully. Feb 20 04:54:53 localhost podman[313251]: 2026-02-20 09:54:53.09813421 +0000 UTC m=+0.058575269 container died 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:54:53 localhost podman[313251]: 2026-02-20 09:54:53.126957105 +0000 UTC m=+0.087398124 container cleanup 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:54:53 localhost systemd[1]: libpod-conmon-5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86.scope: Deactivated 
successfully. Feb 20 04:54:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.152 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:53 localhost podman[313253]: 2026-02-20 09:54:53.181110397 +0000 UTC m=+0.133380058 container remove 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:54:53 localhost systemd[1]: var-lib-containers-storage-overlay-58f04d840f8baaa0f71bc8773e36a918813b860415af52dc0d268b64c80757aa-merged.mount: Deactivated successfully. Feb 20 04:54:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:54:53 localhost ovn_controller[156798]: 2026-02-20T09:54:53Z|00213|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.485 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:46Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=21b880c8-0671-49bf-908b-cd75a9b606ab, ip_allocation=immediate, mac_address=fa:16:3e:07:f9:a2, name=tempest-new-port-name-1177023162, network_id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['921ea913-835d-427d-a3f2-d35699dcd043'], standard_attr_id=1888, status=DOWN, tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:53Z on network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1#033[00m Feb 20 04:54:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.503 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Feb 20 04:54:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.503 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Feb 20 04:54:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.504 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Feb 20 04:54:53 localhost systemd[1]: 
run-netns-qdhcp\x2d2f1c353d\x2d8de5\x2d4616\x2d8b20\x2d8c686b261d9f.mount: Deactivated successfully. Feb 20 04:54:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.519 264355 INFO neutron.agent.dhcp.agent [None req-015c8f29-66f1-4cdf-a119-07fa3585e028 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:53 localhost nova_compute[281288]: 2026-02-20 09:54:53.520 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:53 localhost dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 1 addresses Feb 20 04:54:53 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host Feb 20 04:54:53 localhost podman[313299]: 2026-02-20 09:54:53.696278577 +0000 UTC m=+0.065260300 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 20 04:54:53 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts Feb 20 04:54:53 localhost dnsmasq[312254]: exiting on receipt of SIGTERM Feb 20 04:54:53 localhost podman[313332]: 2026-02-20 09:54:53.856953231 +0000 UTC m=+0.068469877 container kill 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 20 04:54:53 localhost systemd[1]: libpod-1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c.scope: Deactivated successfully. Feb 20 04:54:53 localhost podman[313349]: 2026-02-20 09:54:53.930164152 +0000 UTC m=+0.056433573 container died 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:54:53 localhost podman[313349]: 2026-02-20 09:54:53.96304799 +0000 UTC m=+0.089317421 container cleanup 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Feb 20 04:54:53 localhost systemd[1]: libpod-conmon-1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c.scope: Deactivated successfully. 
Feb 20 04:54:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.980 264355 INFO neutron.agent.dhcp.agent [None req-3cf73f7f-151c-4906-9f4f-8a81f24b7384 - - - - - -] DHCP configuration for ports {'21b880c8-0671-49bf-908b-cd75a9b606ab'} is completed#033[00m Feb 20 04:54:54 localhost podman[313351]: 2026-02-20 09:54:54.01048619 +0000 UTC m=+0.129995685 container remove 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:54:54 localhost systemd[1]: var-lib-containers-storage-overlay-58eee122d4e0dd04c061f43cc5e2ecc11b5cf7d4c3051551afe09a5a3c5f2a8d-merged.mount: Deactivated successfully. Feb 20 04:54:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c-userdata-shm.mount: Deactivated successfully. Feb 20 04:54:54 localhost nova_compute[281288]: 2026-02-20 09:54:54.245 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:54 localhost nova_compute[281288]: 2026-02-20 09:54:54.248 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:54 localhost systemd[1]: run-netns-qdhcp\x2dd6b1eef5\x2d3137\x2d454f\x2d8164\x2d8278293c350a.mount: Deactivated successfully. 
Feb 20 04:54:54 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:54.339 264355 INFO neutron.agent.dhcp.agent [None req-df89c7ed-7efc-4420-a977-ea6a863dcdba - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:54 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:54.597 2 INFO neutron.agent.securitygroups_rpc [None req-3a014f90-3d7f-4958-9a67-4182f974566f 5f5cb06fd4c045b1bfc3f890392c3165 54500b20d6a643669fbf357242dde27f - - default default] Security group member updated ['b8016b60-fce7-435d-b877-28993ceda4d5']#033[00m Feb 20 04:54:54 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:54.766 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:54 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:54.960 2 INFO neutron.agent.securitygroups_rpc [None req-208eb96e-53cb-409e-b54b-a8915a18b91f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:54:55 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:55.016 2 INFO neutron.agent.securitygroups_rpc [None req-d0404fc6-f217-496c-b11d-08ac05ffdfb7 d89d4d1c90df43a18f75b47e31c78f62 7365eb83b07c4401a5a58afb3f122ce5 - - default default] Security group member updated ['921ea913-835d-427d-a3f2-d35699dcd043']#033[00m Feb 20 04:54:55 localhost systemd[1]: tmp-crun.UXs7YT.mount: Deactivated successfully. 
Feb 20 04:54:55 localhost dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 0 addresses Feb 20 04:54:55 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host Feb 20 04:54:55 localhost podman[313393]: 2026-02-20 09:54:55.367092898 +0000 UTC m=+0.073074448 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:54:55 localhost dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts Feb 20 04:54:55 localhost nova_compute[281288]: 2026-02-20 09:54:55.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:55 localhost nova_compute[281288]: 2026-02-20 09:54:55.759 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:54:55 localhost nova_compute[281288]: 2026-02-20 09:54:55.760 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:54:55 localhost nova_compute[281288]: 2026-02-20 09:54:55.760 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:54:55 localhost nova_compute[281288]: 2026-02-20 09:54:55.761 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:54:55 localhost nova_compute[281288]: 2026-02-20 09:54:55.762 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:54:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:54:56 localhost podman[313433]: 2026-02-20 09:54:56.143349329 +0000 UTC m=+0.082795063 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:54:56 localhost podman[313433]: 2026-02-20 09:54:56.178299349 +0000 UTC m=+0.117745023 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:54:56 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:54:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:54:56 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3685988020' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.236 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:54:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.317 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.321 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:54:56 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:56.501 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:56 localhost dnsmasq[312949]: exiting on receipt of SIGTERM Feb 20 04:54:56 localhost systemd[1]: tmp-crun.XUDdAV.mount: Deactivated successfully. 
Feb 20 04:54:56 localhost podman[313475]: 2026-02-20 09:54:56.513957703 +0000 UTC m=+0.081180064 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 20 04:54:56 localhost systemd[1]: libpod-087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6.scope: Deactivated successfully. Feb 20 04:54:56 localhost openstack_network_exporter[244414]: ERROR 09:54:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:54:56 localhost openstack_network_exporter[244414]: Feb 20 04:54:56 localhost openstack_network_exporter[244414]: ERROR 09:54:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:54:56 localhost openstack_network_exporter[244414]: Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.596 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:54:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:56.601 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.601 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11357MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", 
"product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.603 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:54:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:56.603 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.603 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.605 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:56 localhost podman[313495]: 2026-02-20 09:54:56.606591614 +0000 UTC m=+0.065832619 container died 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 20 04:54:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:54:56 localhost podman[313495]: 2026-02-20 09:54:56.65162792 +0000 UTC m=+0.110868875 container remove 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:54:56 localhost ovn_controller[156798]: 2026-02-20T09:54:56Z|00214|binding|INFO|Releasing lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 from this chassis (sb_readonly=0) Feb 20 04:54:56 localhost ovn_controller[156798]: 2026-02-20T09:54:56Z|00215|binding|INFO|Setting lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 down in Southbound Feb 20 04:54:56 localhost kernel: device tap3f9955f7-a5 left promiscuous mode Feb 20 04:54:56 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:56.664 2 INFO neutron.agent.securitygroups_rpc [None req-36db6997-ce64-4f72-86bb-117bda3b0094 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:56.679 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7365eb83b07c4401a5a58afb3f122ce5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5653766c-f9f7-4ea8-b60b-d59b52335179, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3f9955f7-a558-438e-b36a-2dfdb0fa6f03) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:56.681 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 in datapath c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1 unbound from our chassis#033[00m Feb 20 04:54:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:56.684 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:54:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:56.685 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[65ffcdf2-71f4-4ef8-8cb9-c9003a38422b]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.691 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.693 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:56 localhost systemd[1]: libpod-conmon-087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6.scope: Deactivated successfully. Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.707 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.708 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.708 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:54:56 localhost nova_compute[281288]: 2026-02-20 09:54:56.764 281292 DEBUG oslo_concurrency.processutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:54:57 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:54:57 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1996257721' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:54:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:57.272 264355 INFO neutron.agent.dhcp.agent [None req-fa7c5aed-dcab-4d30-acac-b168cc8df1ee - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:57.272 264355 INFO neutron.agent.dhcp.agent [None req-fa7c5aed-dcab-4d30-acac-b168cc8df1ee - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:57 localhost nova_compute[281288]: 2026-02-20 09:54:57.273 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:54:57 localhost nova_compute[281288]: 2026-02-20 09:54:57.278 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:54:57 localhost nova_compute[281288]: 2026-02-20 09:54:57.292 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 
41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:54:57 localhost nova_compute[281288]: 2026-02-20 09:54:57.293 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:54:57 localhost nova_compute[281288]: 2026-02-20 09:54:57.293 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:54:57 localhost systemd[1]: var-lib-containers-storage-overlay-49439000619e910f9d57f4aa05bb363f13b2b24373c80bde35cfff75a4113f51-merged.mount: Deactivated successfully. Feb 20 04:54:57 localhost systemd[1]: run-netns-qdhcp\x2dc4f5fcc5\x2deb5a\x2d46dc\x2db828\x2d35cec3b2f1e1.mount: Deactivated successfully. 
Feb 20 04:54:57 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e141 e141: 6 total, 6 up, 6 in Feb 20 04:54:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:57.772 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:54:57 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:57.781 2 INFO neutron.agent.securitygroups_rpc [None req-a1f738f8-2f94-4ebf-b02a-59bcf9971aeb 51a4789e7d0b404b9882e0c26f7229be 1c44e13adebb4610b7c0cd2fdc62a5b7 - - default default] Security group member updated ['000c42d1-648a-4f56-b7e6-024a1e270fb9']#033[00m Feb 20 04:54:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:58.002 264355 INFO neutron.agent.linux.ip_lib [None req-a0184e0e-748b-48ec-b6a6-b11d207ba4c4 - - - - - -] Device tapea1cc9c8-ce cannot be used as it has no MAC address#033[00m Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:58 localhost kernel: device tapea1cc9c8-ce entered promiscuous mode Feb 20 04:54:58 localhost ovn_controller[156798]: 2026-02-20T09:54:58Z|00216|binding|INFO|Claiming lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe for this chassis. Feb 20 04:54:58 localhost ovn_controller[156798]: 2026-02-20T09:54:58Z|00217|binding|INFO|ea1cc9c8-cec9-46f8-a9aa-5542762442fe: Claiming unknown Feb 20 04:54:58 localhost NetworkManager[5988]: [1771581298.0484] manager: (tapea1cc9c8-ce): new Generic device (/org/freedesktop/NetworkManager/Devices/36) Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.048 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:58 localhost systemd-udevd[313548]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:54:58 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:58.064 2 INFO neutron.agent.securitygroups_rpc [None req-865b7338-5884-4631-b540-e349cbd12cfd f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:54:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:58.069 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2c7c971d-607d-4f86-ac60-49a788864bee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c7c971d-607d-4f86-ac60-49a788864bee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa788937-4ffe-4167-9bda-a66bb7ab07d7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ea1cc9c8-cec9-46f8-a9aa-5542762442fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:54:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:58.071 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ea1cc9c8-cec9-46f8-a9aa-5542762442fe in datapath 2c7c971d-607d-4f86-ac60-49a788864bee bound to our 
chassis#033[00m Feb 20 04:54:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:58.075 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2c7c971d-607d-4f86-ac60-49a788864bee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:54:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:54:58.076 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[86931df8-1654-4cda-8801-e336e8c6c53b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.100 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:58 localhost ovn_controller[156798]: 2026-02-20T09:54:58Z|00218|binding|INFO|Setting lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe ovn-installed in OVS Feb 20 04:54:58 localhost ovn_controller[156798]: 2026-02-20T09:54:58Z|00219|binding|INFO|Setting lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe up in Southbound Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.106 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.136 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.166 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.293 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] 
Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.294 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:58 localhost ovn_controller[156798]: 2026-02-20T09:54:58Z|00220|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.454 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:58 localhost nova_compute[281288]: 2026-02-20 09:54:58.724 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:54:59 localhost podman[313603]: Feb 20 04:54:59 localhost podman[313609]: 2026-02-20 09:54:59.163743615 +0000 UTC m=+0.096283523 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1770267347) Feb 20 04:54:59 localhost podman[313609]: 2026-02-20 09:54:59.17612637 +0000 UTC m=+0.108666248 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, 
name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-type=git, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.openshift.expose-services=) Feb 20 04:54:59 localhost podman[313603]: 2026-02-20 09:54:59.087904223 +0000 UTC m=+0.043787018 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:54:59 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:54:59 localhost podman[313603]: 2026-02-20 09:54:59.198376746 +0000 UTC m=+0.154259471 container create b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:54:59 localhost systemd[1]: Started libpod-conmon-b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a.scope. Feb 20 04:54:59 localhost systemd[1]: Started libcrun container. 
Feb 20 04:54:59 localhost nova_compute[281288]: 2026-02-20 09:54:59.272 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:54:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375106b10e9c61e0995150f814346f15b740b6f2eaa5b60963053b920ba7d52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:54:59 localhost podman[313603]: 2026-02-20 09:54:59.28919181 +0000 UTC m=+0.245074565 container init b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 04:54:59 localhost podman[313603]: 2026-02-20 09:54:59.299085101 +0000 UTC m=+0.254967866 container start b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 20 04:54:59 localhost dnsmasq[313641]: started, version 2.85 cachesize 150 Feb 20 04:54:59 localhost dnsmasq[313641]: DNS service limited to local subnets Feb 20 04:54:59 localhost dnsmasq[313641]: compile time options: IPv6 
GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:54:59 localhost dnsmasq[313641]: warning: no upstream servers configured Feb 20 04:54:59 localhost dnsmasq-dhcp[313641]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:54:59 localhost dnsmasq[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/addn_hosts - 0 addresses Feb 20 04:54:59 localhost dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/host Feb 20 04:54:59 localhost dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/opts Feb 20 04:54:59 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:59.366 264355 INFO neutron.agent.dhcp.agent [None req-a0184e0e-748b-48ec-b6a6-b11d207ba4c4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:57Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c116158e-f1e4-44c0-bd21-ec8f4d275597, ip_allocation=immediate, mac_address=fa:16:3e:7c:7f:62, name=tempest-RoutersIpV6Test-1424937382, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:53Z, description=, dns_domain=, id=2c7c971d-607d-4f86-ac60-49a788864bee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1709331935, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11774, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1901, status=ACTIVE, subnets=['b4df3efd-04f4-4c14-80df-db94b5574306'], tags=[], 
tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:56Z, vlan_transparent=None, network_id=2c7c971d-607d-4f86-ac60-49a788864bee, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['000c42d1-648a-4f56-b7e6-024a1e270fb9'], standard_attr_id=1917, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:57Z on network 2c7c971d-607d-4f86-ac60-49a788864bee#033[00m Feb 20 04:54:59 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:59.489 264355 INFO neutron.agent.dhcp.agent [None req-f112fb60-b282-4bcb-8ccf-e2d6a24a4f7a - - - - - -] DHCP configuration for ports {'586cb9ae-270d-4e56-9729-a053eb7c9410'} is completed#033[00m Feb 20 04:54:59 localhost dnsmasq[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/addn_hosts - 1 addresses Feb 20 04:54:59 localhost dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/host Feb 20 04:54:59 localhost dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/opts Feb 20 04:54:59 localhost podman[313660]: 2026-02-20 09:54:59.548002953 +0000 UTC m=+0.049929856 container kill b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:54:59 localhost neutron_sriov_agent[257177]: 2026-02-20 09:54:59.598 2 INFO neutron.agent.securitygroups_rpc [None 
req-1d54fa0a-8062-4c9c-94e3-875dfdd78a26 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:54:59 localhost nova_compute[281288]: 2026-02-20 09:54:59.719 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:59 localhost nova_compute[281288]: 2026-02-20 09:54:59.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:54:59 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:54:59.860 264355 INFO neutron.agent.dhcp.agent [None req-6849c88b-986f-4f18-b3a1-9ff7e4403be8 - - - - - -] DHCP configuration for ports {'c116158e-f1e4-44c0-bd21-ec8f4d275597'} is completed#033[00m Feb 20 04:55:00 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:00.453 2 INFO neutron.agent.securitygroups_rpc [None req-4a29bc3c-e3e4-4d65-b800-f3b558d9c704 5f5cb06fd4c045b1bfc3f890392c3165 54500b20d6a643669fbf357242dde27f - - default default] Security group member updated ['b8016b60-fce7-435d-b877-28993ceda4d5']#033[00m Feb 20 04:55:00 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:00.605 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:55:01 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:01.082 264355 INFO 
neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:57Z, description=, device_id=4cb71357-16b4-46c1-8592-93fdc1f8bfda, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c116158e-f1e4-44c0-bd21-ec8f4d275597, ip_allocation=immediate, mac_address=fa:16:3e:7c:7f:62, name=tempest-RoutersIpV6Test-1424937382, network_id=2c7c971d-607d-4f86-ac60-49a788864bee, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['000c42d1-648a-4f56-b7e6-024a1e270fb9'], standard_attr_id=1917, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:58Z on network 2c7c971d-607d-4f86-ac60-49a788864bee#033[00m Feb 20 04:55:01 localhost dnsmasq[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/addn_hosts - 1 addresses Feb 20 04:55:01 localhost podman[313699]: 2026-02-20 09:55:01.276985399 +0000 UTC m=+0.060933610 container kill b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:55:01 localhost dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/host Feb 20 04:55:01 localhost dnsmasq-dhcp[313641]: read 
/var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/opts Feb 20 04:55:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:01 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:01.609 264355 INFO neutron.agent.dhcp.agent [None req-22422d1a-79dd-4cc5-a069-0e9563348e6f - - - - - -] DHCP configuration for ports {'c116158e-f1e4-44c0-bd21-ec8f4d275597'} is completed#033[00m Feb 20 04:55:01 localhost nova_compute[281288]: 2026-02-20 09:55:01.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:01 localhost nova_compute[281288]: 2026-02-20 09:55:01.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:55:02 localhost nova_compute[281288]: 2026-02-20 09:55:02.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:02 localhost nova_compute[281288]: 2026-02-20 09:55:02.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:55:02 localhost nova_compute[281288]: 2026-02-20 09:55:02.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:55:02 localhost nova_compute[281288]: 2026-02-20 09:55:02.827 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:55:02 localhost nova_compute[281288]: 2026-02-20 09:55:02.828 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:55:02 localhost nova_compute[281288]: 2026-02-20 09:55:02.829 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:55:02 localhost nova_compute[281288]: 2026-02-20 09:55:02.829 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:55:03 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:03.810 2 INFO neutron.agent.securitygroups_rpc [None req-06243d38-8e6a-45a5-a465-51e5b9c1322f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:03 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:03.922 2 INFO neutron.agent.securitygroups_rpc [None req-befc33c8-098a-487a-a3bd-757b9f93e81d 3ace3fc0d46241ffa2d6d0b16953a588 8aa5b5a34cfe458d96fea87261361db1 - - default default] Security group member updated ['f947e5dd-708d-45fc-8f3d-3e71e4aec5b2']#033[00m Feb 20 04:55:03 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:03.948 2 INFO neutron.agent.securitygroups_rpc [None req-d0194b4e-ea8b-4209-af3b-a875851573ce 51a4789e7d0b404b9882e0c26f7229be 1c44e13adebb4610b7c0cd2fdc62a5b7 - - default default] Security group member updated ['000c42d1-648a-4f56-b7e6-024a1e270fb9']#033[00m Feb 20 04:55:04 localhost nova_compute[281288]: 2026-02-20 09:55:04.139 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:55:04 localhost nova_compute[281288]: 2026-02-20 09:55:04.159 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:55:04 localhost nova_compute[281288]: 2026-02-20 09:55:04.160 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:55:04 localhost dnsmasq[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/addn_hosts - 0 addresses Feb 20 04:55:04 localhost dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/host Feb 20 04:55:04 localhost podman[313737]: 2026-02-20 09:55:04.166274117 +0000 UTC m=+0.044724068 container kill b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:55:04 localhost dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/opts Feb 20 04:55:04 localhost nova_compute[281288]: 2026-02-20 09:55:04.274 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:04 localhost ovn_controller[156798]: 2026-02-20T09:55:04Z|00221|binding|INFO|Releasing lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe from this chassis (sb_readonly=0) Feb 20 04:55:04 localhost kernel: device tapea1cc9c8-ce left promiscuous mode Feb 20 04:55:04 localhost ovn_controller[156798]: 2026-02-20T09:55:04Z|00222|binding|INFO|Setting lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe down in Southbound Feb 20 04:55:04 localhost nova_compute[281288]: 2026-02-20 09:55:04.344 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:04.358 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 
'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2c7c971d-607d-4f86-ac60-49a788864bee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c7c971d-607d-4f86-ac60-49a788864bee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa788937-4ffe-4167-9bda-a66bb7ab07d7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ea1cc9c8-cec9-46f8-a9aa-5542762442fe) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:04.360 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ea1cc9c8-cec9-46f8-a9aa-5542762442fe in datapath 2c7c971d-607d-4f86-ac60-49a788864bee unbound from our chassis#033[00m Feb 20 04:55:04 localhost nova_compute[281288]: 2026-02-20 09:55:04.361 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:04.363 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2c7c971d-607d-4f86-ac60-49a788864bee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:04.365 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4bffdb3f-bb7e-452f-b656-61e6d36b032c]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:04 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:04.956 2 INFO neutron.agent.securitygroups_rpc [None req-a41b9e5e-24c7-4e83-9a8a-33aa24dfcce3 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:55:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e142 e142: 6 total, 6 up, 6 in Feb 20 04:55:05 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:05.224 264355 INFO neutron.agent.linux.ip_lib [None req-b1a077cd-edae-41e2-98f3-4f0daeac1dbc - - - - - -] Device tap482793f1-f2 cannot be used as it has no MAC address#033[00m Feb 20 04:55:05 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:05.257 2 INFO neutron.agent.securitygroups_rpc [None req-563cc02b-9571-4400-9563-57af3068006b f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:05 localhost nova_compute[281288]: 2026-02-20 09:55:05.287 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:05 localhost kernel: device tap482793f1-f2 entered promiscuous mode Feb 20 04:55:05 localhost NetworkManager[5988]: [1771581305.2977] manager: (tap482793f1-f2): new Generic device (/org/freedesktop/NetworkManager/Devices/37) Feb 20 04:55:05 localhost nova_compute[281288]: 2026-02-20 09:55:05.299 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:05 localhost ovn_controller[156798]: 2026-02-20T09:55:05Z|00223|binding|INFO|Claiming lport 482793f1-f265-4388-a4f1-7b2c24064b54 for this chassis. 
Feb 20 04:55:05 localhost ovn_controller[156798]: 2026-02-20T09:55:05Z|00224|binding|INFO|482793f1-f265-4388-a4f1-7b2c24064b54: Claiming unknown Feb 20 04:55:05 localhost systemd-udevd[313767]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:55:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:05.318 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe23:ae62/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54500b20d6a643669fbf357242dde27f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e02367d-9b8e-4ff3-90ec-1850f1ce8ba8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=482793f1-f265-4388-a4f1-7b2c24064b54) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:05.322 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 482793f1-f265-4388-a4f1-7b2c24064b54 in datapath eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8 bound to our chassis#033[00m Feb 20 04:55:05 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:55:05.323 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:05.324 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[11e7c59c-d180-4775-8b8a-f796b2268191]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:05 localhost journal[229984]: ethtool ioctl error on tap482793f1-f2: No such device Feb 20 04:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:55:05 localhost ovn_controller[156798]: 2026-02-20T09:55:05Z|00225|binding|INFO|Setting lport 482793f1-f265-4388-a4f1-7b2c24064b54 ovn-installed in OVS Feb 20 04:55:05 localhost ovn_controller[156798]: 2026-02-20T09:55:05Z|00226|binding|INFO|Setting lport 482793f1-f265-4388-a4f1-7b2c24064b54 up in Southbound Feb 20 04:55:05 localhost nova_compute[281288]: 2026-02-20 09:55:05.339 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:05 localhost journal[229984]: ethtool ioctl error on tap482793f1-f2: No such device Feb 20 04:55:05 localhost journal[229984]: ethtool ioctl error on tap482793f1-f2: No such device Feb 20 04:55:05 localhost journal[229984]: ethtool ioctl error on tap482793f1-f2: No such device Feb 20 04:55:05 localhost journal[229984]: ethtool ioctl error on tap482793f1-f2: No such device Feb 20 04:55:05 localhost journal[229984]: ethtool ioctl error on tap482793f1-f2: No such device Feb 20 04:55:05 localhost journal[229984]: ethtool ioctl error on tap482793f1-f2: No such device Feb 20 04:55:05 
localhost journal[229984]: ethtool ioctl error on tap482793f1-f2: No such device Feb 20 04:55:05 localhost nova_compute[281288]: 2026-02-20 09:55:05.392 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:05 localhost nova_compute[281288]: 2026-02-20 09:55:05.432 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:55:05 localhost systemd[1]: tmp-crun.saTJD0.mount: Deactivated successfully. Feb 20 04:55:05 localhost podman[313773]: 2026-02-20 09:55:05.462735821 +0000 UTC m=+0.111686259 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:55:05 localhost podman[313773]: 2026-02-20 09:55:05.495273719 +0000 UTC m=+0.144224157 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:55:05 localhost systemd[1]: 
67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:55:05 localhost podman[313810]: 2026-02-20 09:55:05.563245771 +0000 UTC m=+0.097997484 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible) Feb 20 04:55:05 localhost podman[313810]: 2026-02-20 09:55:05.596217751 +0000 UTC m=+0.130969444 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3) Feb 20 04:55:05 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:55:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:55:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:06.018 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:55:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:06.018 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:55:06 localhost podman[313879]: Feb 20 04:55:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e143 e143: 6 total, 6 up, 6 in Feb 20 04:55:06 localhost podman[313879]: 2026-02-20 09:55:06.192068228 +0000 UTC m=+0.089751574 container create 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 20 04:55:06 
localhost systemd[1]: Started libpod-conmon-5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd.scope. Feb 20 04:55:06 localhost podman[313879]: 2026-02-20 09:55:06.14759874 +0000 UTC m=+0.045282106 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:55:06 localhost systemd[1]: Started libcrun container. Feb 20 04:55:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c74a41ff9564be3fb088581551a25ec5006a1f6afdd9eb855006f6c53fa651d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:55:06 localhost podman[313879]: 2026-02-20 09:55:06.279852092 +0000 UTC m=+0.177535438 container init 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:55:06 localhost podman[313879]: 2026-02-20 09:55:06.288715051 +0000 UTC m=+0.186398387 container start 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 20 04:55:06 localhost dnsmasq[313897]: started, version 2.85 
cachesize 150 Feb 20 04:55:06 localhost dnsmasq[313897]: DNS service limited to local subnets Feb 20 04:55:06 localhost dnsmasq[313897]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:55:06 localhost dnsmasq[313897]: warning: no upstream servers configured Feb 20 04:55:06 localhost dnsmasq[313897]: read /var/lib/neutron/dhcp/eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8/addn_hosts - 0 addresses Feb 20 04:55:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:06 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:06.487 264355 INFO neutron.agent.dhcp.agent [None req-4133f662-87a7-42b5-b620-db3d2d9639fd - - - - - -] DHCP configuration for ports {'424d9e79-b09e-4419-bfce-356fcae07ea1'} is completed#033[00m Feb 20 04:55:06 localhost dnsmasq[313897]: exiting on receipt of SIGTERM Feb 20 04:55:06 localhost podman[313916]: 2026-02-20 09:55:06.705722153 +0000 UTC m=+0.048789092 container kill 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:55:06 localhost systemd[1]: libpod-5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd.scope: Deactivated successfully. 
Feb 20 04:55:06 localhost podman[313928]: 2026-02-20 09:55:06.77617383 +0000 UTC m=+0.055592108 container died 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 20 04:55:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd-userdata-shm.mount: Deactivated successfully. Feb 20 04:55:06 localhost podman[313928]: 2026-02-20 09:55:06.807666985 +0000 UTC m=+0.087085223 container cleanup 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:55:06 localhost systemd[1]: libpod-conmon-5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd.scope: Deactivated successfully. 
Feb 20 04:55:06 localhost podman[313930]: 2026-02-20 09:55:06.84408791 +0000 UTC m=+0.118142775 container remove 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 04:55:06 localhost ovn_controller[156798]: 2026-02-20T09:55:06Z|00227|binding|INFO|Releasing lport 482793f1-f265-4388-a4f1-7b2c24064b54 from this chassis (sb_readonly=0) Feb 20 04:55:06 localhost nova_compute[281288]: 2026-02-20 09:55:06.854 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:06 localhost ovn_controller[156798]: 2026-02-20T09:55:06Z|00228|binding|INFO|Setting lport 482793f1-f265-4388-a4f1-7b2c24064b54 down in Southbound Feb 20 04:55:06 localhost kernel: device tap482793f1-f2 left promiscuous mode Feb 20 04:55:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:06.871 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe23:ae62/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54500b20d6a643669fbf357242dde27f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e02367d-9b8e-4ff3-90ec-1850f1ce8ba8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=482793f1-f265-4388-a4f1-7b2c24064b54) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:06.873 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 482793f1-f265-4388-a4f1-7b2c24064b54 in datapath eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8 unbound from our chassis#033[00m Feb 20 04:55:06 localhost nova_compute[281288]: 2026-02-20 09:55:06.873 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:06.876 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:06.877 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[671b8dd9-c476-494d-8bf4-ad289185954a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:06 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:06.910 264355 INFO neutron.agent.dhcp.agent [None 
req-d302ebcb-4aeb-4719-9eb5-b9b9cc423d70 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:06 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:06.911 264355 INFO neutron.agent.dhcp.agent [None req-d302ebcb-4aeb-4719-9eb5-b9b9cc423d70 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:07 localhost dnsmasq[313641]: exiting on receipt of SIGTERM Feb 20 04:55:07 localhost podman[313972]: 2026-02-20 09:55:07.099100167 +0000 UTC m=+0.041330385 container kill b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:55:07 localhost systemd[1]: libpod-b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a.scope: Deactivated successfully. 
Feb 20 04:55:07 localhost podman[313984]: 2026-02-20 09:55:07.15821185 +0000 UTC m=+0.050647477 container died b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 20 04:55:07 localhost podman[313984]: 2026-02-20 09:55:07.180506907 +0000 UTC m=+0.072942494 container cleanup b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 20 04:55:07 localhost systemd[1]: libpod-conmon-b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a.scope: Deactivated successfully. 
Feb 20 04:55:07 localhost podman[313991]: 2026-02-20 09:55:07.23368435 +0000 UTC m=+0.110935466 container remove b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 20 04:55:07 localhost ovn_controller[156798]: 2026-02-20T09:55:07Z|00229|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:55:07 localhost nova_compute[281288]: 2026-02-20 09:55:07.422 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:07 localhost systemd[1]: var-lib-containers-storage-overlay-c74a41ff9564be3fb088581551a25ec5006a1f6afdd9eb855006f6c53fa651d7-merged.mount: Deactivated successfully. Feb 20 04:55:07 localhost systemd[1]: run-netns-qdhcp\x2deddc0ec2\x2dac95\x2d44b0\x2d8ba7\x2d88ad136c7ae8.mount: Deactivated successfully. Feb 20 04:55:07 localhost systemd[1]: var-lib-containers-storage-overlay-1375106b10e9c61e0995150f814346f15b740b6f2eaa5b60963053b920ba7d52-merged.mount: Deactivated successfully. Feb 20 04:55:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:55:07 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:07.462 2 INFO neutron.agent.securitygroups_rpc [None req-63a5189c-53e8-40d9-9654-04d02c3d7a9f 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:07.565 264355 INFO neutron.agent.dhcp.agent [None req-d60a952b-8db8-40d6-beec-996cc9523d86 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:07 localhost systemd[1]: run-netns-qdhcp\x2d2c7c971d\x2d607d\x2d4f86\x2dac60\x2d49a788864bee.mount: Deactivated successfully. Feb 20 04:55:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:07.650 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:08.153 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e144 e144: 6 total, 6 up, 6 in Feb 20 04:55:08 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:08.623 2 INFO neutron.agent.securitygroups_rpc [None req-08efe737-22a8-48a9-b7b8-e1929d8369c8 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:55:08 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:08.950 2 INFO neutron.agent.securitygroups_rpc [None req-3b1327f4-a56e-425d-b764-df7dfb03463f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e145 e145: 6 total, 6 up, 6 in Feb 20 04:55:09 localhost 
nova_compute[281288]: 2026-02-20 09:55:09.277 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:10 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:10.372 2 INFO neutron.agent.securitygroups_rpc [None req-9a8b2a58-7fed-4400-9808-58a86984ca8a 3ace3fc0d46241ffa2d6d0b16953a588 8aa5b5a34cfe458d96fea87261361db1 - - default default] Security group member updated ['f947e5dd-708d-45fc-8f3d-3e71e4aec5b2']#033[00m Feb 20 04:55:10 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:10.910 264355 INFO neutron.agent.linux.ip_lib [None req-449e4550-801e-4cb5-8a33-c704d023f847 - - - - - -] Device tap43ee6e73-b5 cannot be used as it has no MAC address#033[00m Feb 20 04:55:10 localhost nova_compute[281288]: 2026-02-20 09:55:10.981 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:10 localhost kernel: device tap43ee6e73-b5 entered promiscuous mode Feb 20 04:55:10 localhost NetworkManager[5988]: [1771581310.9897] manager: (tap43ee6e73-b5): new Generic device (/org/freedesktop/NetworkManager/Devices/38) Feb 20 04:55:10 localhost ovn_controller[156798]: 2026-02-20T09:55:10Z|00230|binding|INFO|Claiming lport 43ee6e73-b59d-4f8c-bc0d-e76444d22449 for this chassis. Feb 20 04:55:10 localhost ovn_controller[156798]: 2026-02-20T09:55:10Z|00231|binding|INFO|43ee6e73-b59d-4f8c-bc0d-e76444d22449: Claiming unknown Feb 20 04:55:10 localhost nova_compute[281288]: 2026-02-20 09:55:10.991 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:10 localhost systemd-udevd[314024]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.000 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-46a8490e-a535-4bce-9ac1-74e63b0f238d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-46a8490e-a535-4bce-9ac1-74e63b0f238d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54500b20d6a643669fbf357242dde27f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2478f085-2c0d-4bdc-b22a-1c88d748beb7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=43ee6e73-b59d-4f8c-bc0d-e76444d22449) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.003 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 43ee6e73-b59d-4f8c-bc0d-e76444d22449 in datapath 46a8490e-a535-4bce-9ac1-74e63b0f238d bound to our chassis#033[00m Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.005 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 46a8490e-a535-4bce-9ac1-74e63b0f238d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.008 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[909dd26e-35e6-4786-8971-33699d73ebb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device Feb 20 04:55:11 localhost ovn_controller[156798]: 2026-02-20T09:55:11Z|00232|binding|INFO|Setting lport 43ee6e73-b59d-4f8c-bc0d-e76444d22449 ovn-installed in OVS Feb 20 04:55:11 localhost ovn_controller[156798]: 2026-02-20T09:55:11Z|00233|binding|INFO|Setting lport 43ee6e73-b59d-4f8c-bc0d-e76444d22449 up in Southbound Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.044 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.090 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.129 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:55:11 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:11.783 2 INFO neutron.agent.securitygroups_rpc [None req-9a050612-ac54-4498-bab7-ac1559fb18bf 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.824 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c3441dcb-93b0-4351-aba2-94a1914f5ed5 with type ""
Feb 20 04:55:11 localhost ovn_controller[156798]: 2026-02-20T09:55:11Z|00234|binding|INFO|Removing iface tap43ee6e73-b5 ovn-installed in OVS
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.826 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-46a8490e-a535-4bce-9ac1-74e63b0f238d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-46a8490e-a535-4bce-9ac1-74e63b0f238d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54500b20d6a643669fbf357242dde27f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2478f085-2c0d-4bdc-b22a-1c88d748beb7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=43ee6e73-b59d-4f8c-bc0d-e76444d22449) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.828 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 43ee6e73-b59d-4f8c-bc0d-e76444d22449 in datapath 46a8490e-a535-4bce-9ac1-74e63b0f238d unbound from our chassis
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.830 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 46a8490e-a535-4bce-9ac1-74e63b0f238d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.833 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7641f8-7de4-4731-bcba-bca580f620db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 04:55:11 localhost ovn_controller[156798]: 2026-02-20T09:55:11Z|00235|binding|INFO|Removing lport 43ee6e73-b59d-4f8c-bc0d-e76444d22449 ovn-installed in OVS
Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.835 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:11 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:11.904 2 INFO neutron.agent.securitygroups_rpc [None req-36106926-50cd-429d-aa41-30ad5718ad39 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 04:55:11 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:11.928 264355 INFO neutron.agent.linux.ip_lib [None req-29401c17-9f88-48b5-9e1b-c3ed33377fb4 - - - - - -] Device tap4771034d-1a cannot be used as it has no MAC address
Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.943 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:11 localhost kernel: device tap4771034d-1a entered promiscuous mode
Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.949 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:11 localhost NetworkManager[5988]: [1771581311.9493] manager: (tap4771034d-1a): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Feb 20 04:55:11 localhost ovn_controller[156798]: 2026-02-20T09:55:11Z|00236|binding|INFO|Claiming lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 for this chassis.
Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.959 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:11 localhost ovn_controller[156798]: 2026-02-20T09:55:11Z|00237|binding|INFO|4771034d-1a14-45f4-a39e-d4b7e4f8e0b4: Claiming unknown
Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.969 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-db986821-358f-44d6-9e8f-8928c31d10ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db986821-358f-44d6-9e8f-8928c31d10ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ccf68c5-a7cb-4cfb-a4c3-a082862eb086, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4771034d-1a14-45f4-a39e-d4b7e4f8e0b4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.970 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 in datapath db986821-358f-44d6-9e8f-8928c31d10ae bound to our chassis
Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.971 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network db986821-358f-44d6-9e8f-8928c31d10ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 04:55:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:11.973 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[2df336b2-e543-46e7-a989-16889df2f413]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 04:55:11 localhost ovn_controller[156798]: 2026-02-20T09:55:11Z|00238|binding|INFO|Setting lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 ovn-installed in OVS
Feb 20 04:55:11 localhost ovn_controller[156798]: 2026-02-20T09:55:11Z|00239|binding|INFO|Setting lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 up in Southbound
Feb 20 04:55:11 localhost nova_compute[281288]: 2026-02-20 09:55:11.995 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:11 localhost journal[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 04:55:12 localhost journal[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 04:55:12 localhost podman[314102]:
Feb 20 04:55:12 localhost nova_compute[281288]: 2026-02-20 09:55:12.011 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:12 localhost podman[314102]: 2026-02-20 09:55:12.022417847 +0000 UTC m=+0.086912658 container create 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 20 04:55:12 localhost nova_compute[281288]: 2026-02-20 09:55:12.033 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:12 localhost systemd[1]: Started libpod-conmon-81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1.scope.
Feb 20 04:55:12 localhost systemd[1]: Started libcrun container.
Feb 20 04:55:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0338ae8987ed2c464584a36ca2ce03dbc2fc06cedc3471135948acdfc2656ace/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 04:55:12 localhost podman[314102]: 2026-02-20 09:55:12.083003684 +0000 UTC m=+0.147498455 container init 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 04:55:12 localhost podman[314102]: 2026-02-20 09:55:11.985107784 +0000 UTC m=+0.049602575 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 04:55:12 localhost podman[314102]: 2026-02-20 09:55:12.088484811 +0000 UTC m=+0.152979622 container start 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 04:55:12 localhost dnsmasq[314148]: started, version 2.85 cachesize 150
Feb 20 04:55:12 localhost dnsmasq[314148]: DNS service limited to local subnets
Feb 20 04:55:12 localhost dnsmasq[314148]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 04:55:12 localhost dnsmasq[314148]: warning: no upstream servers configured
Feb 20 04:55:12 localhost dnsmasq-dhcp[314148]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 04:55:12 localhost dnsmasq[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/addn_hosts - 0 addresses
Feb 20 04:55:12 localhost dnsmasq-dhcp[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/host
Feb 20 04:55:12 localhost dnsmasq-dhcp[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/opts
Feb 20 04:55:12 localhost sshd[314149]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 04:55:12 localhost nova_compute[281288]: 2026-02-20 09:55:12.148 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:12 localhost kernel: device tap43ee6e73-b5 left promiscuous mode
Feb 20 04:55:12 localhost nova_compute[281288]: 2026-02-20 09:55:12.164 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.265 264355 INFO neutron.agent.dhcp.agent [None req-7f2d0925-b7d2-4002-908e-9cc18f9ae448 - - - - - -] DHCP configuration for ports {'bdecb4bc-5e0d-4a76-954d-639d3a7f9a03'} is completed
Feb 20 04:55:12 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:12.340 2 INFO neutron.agent.securitygroups_rpc [None req-9a050612-ac54-4498-bab7-ac1559fb18bf 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 04:55:12 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e146 e146: 6 total, 6 up, 6 in
Feb 20 04:55:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 04:55:12 localhost systemd[1]: tmp-crun.6JUUyE.mount: Deactivated successfully.
Feb 20 04:55:12 localhost podman[314211]:
Feb 20 04:55:12 localhost podman[314211]: 2026-02-20 09:55:12.857859083 +0000 UTC m=+0.111007609 container create c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 04:55:12 localhost systemd[1]: Started libpod-conmon-c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a.scope.
Feb 20 04:55:12 localhost podman[314211]: 2026-02-20 09:55:12.809733463 +0000 UTC m=+0.062882049 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 04:55:12 localhost podman[314222]: 2026-02-20 09:55:12.911201752 +0000 UTC m=+0.091988232 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 04:55:12 localhost podman[314226]: 2026-02-20 09:55:12.933034604 +0000 UTC m=+0.107657638 container kill 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 04:55:12 localhost dnsmasq[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/addn_hosts - 0 addresses
Feb 20 04:55:12 localhost dnsmasq-dhcp[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/host
Feb 20 04:55:12 localhost dnsmasq-dhcp[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/opts
Feb 20 04:55:12 localhost systemd[1]: Started libcrun container.
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 46a8490e-a535-4bce-9ac1-74e63b0f238d.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap43ee6e73-b5 not found in namespace qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d.
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Feb 20 04:55:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24261156bad49a8305fefc4579d7635ae2afa5b6e97772dd5140da521b7fab16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return fut.result()
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     raise self._exception
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap43ee6e73-b5 not found in namespace qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d.
Feb 20 04:55:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent
Feb 20 04:55:12 localhost podman[314222]: 2026-02-20 09:55:12.96027029 +0000 UTC m=+0.141056710 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 04:55:12 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 04:55:12 localhost podman[314211]: 2026-02-20 09:55:12.976833362 +0000 UTC m=+0.229981938 container init c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 04:55:12 localhost podman[314211]: 2026-02-20 09:55:12.982661679 +0000 UTC m=+0.235810235 container start c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 20 04:55:12 localhost dnsmasq[314263]: started, version 2.85 cachesize 150
Feb 20 04:55:12 localhost dnsmasq[314263]: DNS service limited to local subnets
Feb 20 04:55:12 localhost dnsmasq[314263]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 04:55:12 localhost dnsmasq[314263]: warning: no upstream servers configured
Feb 20 04:55:12 localhost dnsmasq-dhcp[314263]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 04:55:12 localhost dnsmasq[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/addn_hosts - 0 addresses
Feb 20 04:55:12 localhost dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/host
Feb 20 04:55:12 localhost dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/opts
Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.027 264355 INFO neutron.agent.dhcp.agent [None req-1a2e3763-4058-4706-bf05-99d8b392f202 - - - - - -] Synchronizing state
Feb 20 04:55:13 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:13.065 2 INFO neutron.agent.securitygroups_rpc [None req-5b112f63-69ca-4cb8-af5b-6c4c1770d4b6 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.121 264355 INFO neutron.agent.dhcp.agent [None req-68834c23-3df4-4658-b0ad-65ce3e176213 - - - - - -] DHCP configuration for ports {'cd05ab2e-f393-45d1-8131-90841abef68e'} is completed
Feb 20 04:55:13 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:13.280 2 INFO neutron.agent.securitygroups_rpc [None req-624534fd-50bc-4fd0-a351-6a4578734382 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.315 264355 INFO neutron.agent.dhcp.agent [None req-c2bf4659-23a7-4d94-a36c-8a2e456b8aa4 - - - - - -] All active networks have been fetched through RPC.
Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.316 264355 INFO neutron.agent.dhcp.agent [-] Starting network 46a8490e-a535-4bce-9ac1-74e63b0f238d dhcp configuration
Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.316 264355 INFO neutron.agent.dhcp.agent [-] Finished network 46a8490e-a535-4bce-9ac1-74e63b0f238d dhcp configuration
Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.317 264355 INFO neutron.agent.dhcp.agent [None req-c2bf4659-23a7-4d94-a36c-8a2e456b8aa4 - - - - - -] Synchronizing state complete
Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.317 264355 INFO neutron.agent.dhcp.agent [None req-29401c17-9f88-48b5-9e1b-c3ed33377fb4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:12Z, description=, device_id=f26d36e8-7d9e-47ed-9784-1de0d2ce9ae4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=dac9892a-d1b0-4086-9174-145617b0e7c9, ip_allocation=immediate, mac_address=fa:16:3e:71:14:9a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:08Z, description=, dns_domain=, id=db986821-358f-44d6-9e8f-8928c31d10ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-904494756, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12149, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1973, status=ACTIVE, subnets=['92327ff9-f3f4-4fb2-9234-a9395798504e'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:10Z, vlan_transparent=None, network_id=db986821-358f-44d6-9e8f-8928c31d10ae, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1988, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:12Z on network db986821-358f-44d6-9e8f-8928c31d10ae
Feb 20 04:55:13 localhost ovn_controller[156798]: 2026-02-20T09:55:13Z|00240|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 04:55:13 localhost nova_compute[281288]: 2026-02-20 09:55:13.528 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:13 localhost dnsmasq[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/addn_hosts - 1 addresses
Feb 20 04:55:13 localhost dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/host
Feb 20 04:55:13 localhost dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/opts
Feb 20 04:55:13 localhost podman[314290]: 2026-02-20 09:55:13.565274035 +0000 UTC m=+0.103667326 container kill c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 04:55:13 localhost dnsmasq[314148]: exiting on receipt of SIGTERM
Feb 20 04:55:13 localhost podman[314312]: 2026-02-20 09:55:13.692848736 +0000 UTC m=+0.059671761 container kill 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 04:55:13 localhost systemd[1]: libpod-81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1.scope: Deactivated successfully.
Feb 20 04:55:13 localhost podman[314330]: 2026-02-20 09:55:13.768411528 +0000 UTC m=+0.059644420 container died 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 20 04:55:13 localhost podman[314330]: 2026-02-20 09:55:13.799298186 +0000 UTC m=+0.090531038 container cleanup 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 20 04:55:13 localhost systemd[1]: libpod-conmon-81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1.scope: Deactivated successfully. Feb 20 04:55:13 localhost systemd[1]: var-lib-containers-storage-overlay-0338ae8987ed2c464584a36ca2ce03dbc2fc06cedc3471135948acdfc2656ace-merged.mount: Deactivated successfully. Feb 20 04:55:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:55:13 localhost podman[314333]: 2026-02-20 09:55:13.855973645 +0000 UTC m=+0.140988588 container remove 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.886 264355 INFO neutron.agent.dhcp.agent [None req-e1eb74c7-20a6-4ab1-b283-f34bddc1a110 - - - - - -] DHCP configuration for ports {'dac9892a-d1b0-4086-9174-145617b0e7c9'} is completed#033[00m Feb 20 04:55:13 localhost systemd[1]: run-netns-qdhcp\x2d46a8490e\x2da535\x2d4bce\x2d9ac1\x2d74e63b0f238d.mount: Deactivated successfully. 
Feb 20 04:55:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.970 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:12Z, description=, device_id=f26d36e8-7d9e-47ed-9784-1de0d2ce9ae4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=dac9892a-d1b0-4086-9174-145617b0e7c9, ip_allocation=immediate, mac_address=fa:16:3e:71:14:9a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:08Z, description=, dns_domain=, id=db986821-358f-44d6-9e8f-8928c31d10ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-904494756, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12149, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1973, status=ACTIVE, subnets=['92327ff9-f3f4-4fb2-9234-a9395798504e'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:10Z, vlan_transparent=None, network_id=db986821-358f-44d6-9e8f-8928c31d10ae, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1988, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:12Z on network db986821-358f-44d6-9e8f-8928c31d10ae#033[00m Feb 20 04:55:14 localhost podman[314375]: 2026-02-20 09:55:14.161616338 +0000 UTC m=+0.060011382 container kill c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:55:14 localhost dnsmasq[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/addn_hosts - 1 addresses Feb 20 04:55:14 localhost dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/host Feb 20 04:55:14 localhost dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/opts Feb 20 04:55:14 localhost nova_compute[281288]: 2026-02-20 09:55:14.280 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:14 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:14.480 264355 INFO neutron.agent.dhcp.agent [None req-92ba39a7-7cc4-4605-856e-f6754aa06c8e - - - - - -] DHCP configuration for ports {'dac9892a-d1b0-4086-9174-145617b0e7c9'} is completed#033[00m Feb 20 04:55:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:14.506 2 INFO neutron.agent.securitygroups_rpc [None req-cfd67c24-e1f2-4ca4-a53e-99c158a0738a 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:14 localhost podman[314414]: 2026-02-20 09:55:14.790458127 +0000 UTC m=+0.062737175 container kill c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, 
org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:55:14 localhost dnsmasq[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/addn_hosts - 0 addresses Feb 20 04:55:14 localhost dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/host Feb 20 04:55:14 localhost dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/opts Feb 20 04:55:14 localhost kernel: device tap4771034d-1a left promiscuous mode Feb 20 04:55:14 localhost nova_compute[281288]: 2026-02-20 09:55:14.996 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:14 localhost ovn_controller[156798]: 2026-02-20T09:55:14Z|00241|binding|INFO|Releasing lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 from this chassis (sb_readonly=0) Feb 20 04:55:14 localhost ovn_controller[156798]: 2026-02-20T09:55:14Z|00242|binding|INFO|Setting lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 down in Southbound Feb 20 04:55:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:15.005 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-db986821-358f-44d6-9e8f-8928c31d10ae', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db986821-358f-44d6-9e8f-8928c31d10ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ccf68c5-a7cb-4cfb-a4c3-a082862eb086, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4771034d-1a14-45f4-a39e-d4b7e4f8e0b4) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:15.007 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 in datapath db986821-358f-44d6-9e8f-8928c31d10ae unbound from our chassis#033[00m Feb 20 04:55:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:15.009 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network db986821-358f-44d6-9e8f-8928c31d10ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:15.010 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b04ce51c-392a-4be2-9233-53aeb8eec8a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:15 localhost nova_compute[281288]: 2026-02-20 09:55:15.023 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:15 localhost nova_compute[281288]: 2026-02-20 09:55:15.025 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:16 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:16.010 2 INFO neutron.agent.securitygroups_rpc [None req-7271509a-2575-4000-bc96-a2fd7601216d f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e147 e147: 6 total, 6 up, 6 in Feb 20 04:55:17 localhost podman[314454]: 2026-02-20 09:55:17.082125773 +0000 UTC m=+0.062979422 container kill c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:55:17 localhost dnsmasq[314263]: exiting on receipt of SIGTERM Feb 20 04:55:17 localhost systemd[1]: libpod-c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a.scope: Deactivated successfully. 
Feb 20 04:55:17 localhost podman[314469]: 2026-02-20 09:55:17.152681774 +0000 UTC m=+0.053185725 container died c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:55:17 localhost systemd[1]: tmp-crun.P0oI4q.mount: Deactivated successfully. Feb 20 04:55:17 localhost podman[314469]: 2026-02-20 09:55:17.261931998 +0000 UTC m=+0.162435909 container cleanup c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:17 localhost systemd[1]: libpod-conmon-c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a.scope: Deactivated successfully. 
Feb 20 04:55:17 localhost podman[314468]: 2026-02-20 09:55:17.281148381 +0000 UTC m=+0.179266019 container remove c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3) Feb 20 04:55:17 localhost podman[241968]: time="2026-02-20T09:55:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:55:17 localhost podman[241968]: @ - - [20/Feb/2026:09:55:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:55:17 localhost podman[241968]: @ - - [20/Feb/2026:09:55:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18339 "" "Go-http-client/1.1" Feb 20 04:55:17 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:17.794 264355 INFO neutron.agent.dhcp.agent [None req-9a99823c-4a01-4153-b1a7-1f995f7ae87d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:17 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:17.803 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:17 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:17.977 2 INFO neutron.agent.securitygroups_rpc [None req-99b1b964-36a8-407d-a34f-bd7246c382f8 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m 
Feb 20 04:55:17 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:17.981 2 INFO neutron.agent.securitygroups_rpc [None req-9d512101-812d-472a-bd29-050847053b0a f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:18 localhost systemd[1]: var-lib-containers-storage-overlay-24261156bad49a8305fefc4579d7635ae2afa5b6e97772dd5140da521b7fab16-merged.mount: Deactivated successfully. Feb 20 04:55:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a-userdata-shm.mount: Deactivated successfully. Feb 20 04:55:18 localhost systemd[1]: run-netns-qdhcp\x2ddb986821\x2d358f\x2d44d6\x2d9e8f\x2d8928c31d10ae.mount: Deactivated successfully. Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.209 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:55:18.216 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0308ddb8-8650-4867-bc8a-8d85a653999e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.211079', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4322aa3e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': 
'63cf382f663fa482fad6af017b88d9ee6459b2237bd74e76afe3216cabd06b68'}]}, 'timestamp': '2026-02-20 09:55:18.217462', '_unique_id': '7937c235e65d4e6a8236af6252fa6e01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.250 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44163411-ae98-424f-998f-0e527e369af1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.220258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4327b074-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '62b06f7a25b900e95b3a6bdfb5b0558a8f51b45690ac0b0769615b803d7939b6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.220258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4327c21c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': 'ccbcd25ea41cd0844b3434cf3f236fba7462ed1068a826746d999243eac5defa'}]}, 'timestamp': '2026-02-20 09:55:18.250829', '_unique_id': '4365349e2d124a289570e4163eab5f10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e8e93a2-e0b1-454f-9d9c-9a709257b9ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.253255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43283486-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '9413e6ea9816c541620301ece107fc5adb0ee1c89ca4d35b0fb11d3f18b1cb30'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.253255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '432848b8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '3497e4f40c0e2efece87ea6b3d4343e0da1dbf15252203b57fe9a6d5ffde5242'}]}, 'timestamp': '2026-02-20 09:55:18.254204', '_unique_id': 'f35b3f2a900b417ca8dff3ee39eabd44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb70fa9b-a88a-4853-968f-56087c9c039d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.256392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '432a5f18-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': 'c75c356c2436840165ca0af7d778ba1e00768ae4e16de6899fc8ffc44fd6ee59'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.256392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '432a6fe4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': '9977d24a0f71dedc1f83b5eabc49a8c809323fcaeac35c4efa43ec55456fefe8'}]}, 'timestamp': '2026-02-20 09:55:18.268311', '_unique_id': 'a4d7fddbe39d498995026a3a78be989e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost systemd-journald[48359]: Data hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Feb 20 04:55:18 localhost systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.270 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98406af6-c826-40eb-b148-76340e527aa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.270509', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432ad7f4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '411becbbefcbc37233a0ae4decd56bc696779d886367eca4bc6f6efa0b72fb70'}]}, 'timestamp': '2026-02-20 09:55:18.271005', '_unique_id': 'b93e308a056a4d95b07b03e92b101d06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 
04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.273 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd111b78d-348b-4a7b-9f19-1a3c09e9ede3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.273221', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432b402c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '2b4adb65eb3df8022083599668183c3d1ea908bba1deb464abba07bdcf2f82b2'}]}, 'timestamp': '2026-02-20 09:55:18.273702', '_unique_id': '0ffeeffc188f4f43bcd255800f7bdfdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.276 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef335e9a-cfda-48f7-b014-ddd0a841cd00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.276011', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432bad46-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': 'd8375db9499f9b5254a5f8432da70f0f9abe72c3c824139b79b7a881030230ab'}]}, 'timestamp': '2026-02-20 09:55:18.276463', '_unique_id': 'a65f60537c4b42789e5bac8bf44789cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.278 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.279 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04967746-a53c-4fe3-8cad-6e59a9f4f6e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.278997', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432c2208-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '97e86b9220f5da1c3f5d018fe7ef98efce66ec86b833e89df17b4e419a417002'}]}, 'timestamp': '2026-02-20 09:55:18.279453', '_unique_id': 'dde045211b804f33a2aab4f7e523a295'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4b3f8c6-eb09-44e5-beea-840b5dbc672a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.281522', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432c86ee-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': 'd2d03f4bb5dad94a0aff4a6f5763a032190ebdf571302846a2a57d561bc05d34'}]}, 'timestamp': '2026-02-20 09:55:18.282034', '_unique_id': '60f2e83d851f4fe6b1b8a7d0f128677b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '9af111ee-ec6e-4cc4-879f-f2d5ab0148c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:55:18.284072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '432f9276-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.540533254, 'message_signature': '0062a99a0209afba08732627cd7cb65f8fcd15add4a890c90e0fc4bdaf4a40b7'}]}, 'timestamp': '2026-02-20 09:55:18.301986', '_unique_id': '061b1602d7944239acc3d315f3cb2ecc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:55:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbb53467-3376-4df7-ac26-e158451d8975', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.304246', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432ffc98-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 
'message_signature': 'ae85aacafc0663775f1edcd41e01318ffca6c598536af7d30295a24451e20ea1'}]}, 'timestamp': '2026-02-20 09:55:18.304747', '_unique_id': '51da9bf793634c3c8f2b7f79f198695e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1afdc59a-ea37-499c-95a3-0e8121a0ec40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.306963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '433066d8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': '4629f42c1ad43a1dc0cdf3a7bbbc95c77378a444ed7356e340204f7a2022bb51'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.306963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43307768-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': 'bf4232903a3d996515942c143259a53122e7caf998ff0b48f263733955ffbf50'}]}, 'timestamp': '2026-02-20 09:55:18.307872', '_unique_id': 'eff9f0b618664d44bc3d3e9a789d5f14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:55:18.308 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:55:18.308 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.310 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.310 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:55:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '736651c6-a8ad-4877-bff7-ac58406d5a72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.310121', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4330e20c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '4122abc75fb9d2bf08e26f33af00d9bf6a0b0020e28eaba6200165e561559e77'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.310121', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4330f4ae-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '72ba8afe4fb3e68949cb506385511bebbf099593903c4df189618a6ab62eac6a'}]}, 'timestamp': '2026-02-20 09:55:18.311056', '_unique_id': '0f82dd8f55f74422aeba655f169d4be9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'caf433e0-32e5-4bab-909e-4bfc15d44c88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.313215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43315a5c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': 'c1600fc6d8abcb3061beb07e4ddb111ea428fe6c71654936c64b2b5e63f2aefc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': 
'91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.313215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43316c4a-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '8e29aa5499498053c608e5b3492fbf3b18bdc39cb30fe5ce8df15b66cffeb971'}]}, 'timestamp': '2026-02-20 09:55:18.314091', '_unique_id': '2acbee8351bd4eb3a2af110c97f1c2d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b46f38c2-e419-4619-b2cd-10c2f5dde1cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.316180', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4331ce56-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '54bd21f34d2766aa42ebe533602971bfe04c96f10e98a3e8b401075a4650df4b'}]}, 'timestamp': '2026-02-20 09:55:18.316629', '_unique_id': 'e409c98ab6a84ebdb694b181e8056b74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR 
oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.318 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.318 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '009aeb26-c6cf-4f0f-95e4-29bf67b46048', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.318740', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4332326a-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '5ffc244c730b7033abdba2b8406703f1aa608c65ec5269b5f2af1c03bd53bd40'}]}, 'timestamp': '2026-02-20 09:55:18.319247', '_unique_id': '117fa2c6214f49749396800a58e87c07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '693ff355-b5c5-435c-aa89-512c3e16b5e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.321335', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43329782-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': 'da614cccb741f8e53f2c3ec5ca24a7a1c1c6211f7084557534ab155edceb4b29'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.321335', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4332ab00-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': 'cc6876b3bd6d2295d84d0b40c04a7e5501383d17185b3bd1a4d579a56cfa83ed'}]}, 'timestamp': '2026-02-20 09:55:18.322252', '_unique_id': '150e5deea8224e0bb80ca24d3327d1b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.324 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.324 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '69e7eab3-8d1a-4c17-af5e-aaeea9e3e0c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.324321', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '43330c62-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': 'e5b25c0abba0cea62f63610e1c1b7d4090ad30f9e2662102fc964bbf2d23b0e9'}]}, 'timestamp': '2026-02-20 09:55:18.324878', '_unique_id': '0bb8a038482341918b9e6b40dff29427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:55:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '04732ff0-c629-4fc2-9c0c-fa1138e6e4cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.326797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '433369b4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '335db5c59eb2fbbdba444b71465c673085f44b3d2b1861fa69b5034b809a58e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.326797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '433373d2-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '5d07d7a4fda264a1159146392832a34356c4216864c19f52cb21c22e711e0508'}]}, 'timestamp': '2026-02-20 09:55:18.327315', '_unique_id': '09ceea6dcc0a4ca2ae4d96e08ee497aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:55:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.328 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 17620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecfde883-9c84-4eff-a169-1f355ae8f96e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17620000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:55:18.328599', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4333b194-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.540533254, 'message_signature': 'b9978cf158c01db6a69960fffb4bd53a511a36252447e19747e0037f55df9c7b'}]}, 'timestamp': '2026-02-20 09:55:18.328904', '_unique_id': '5d51c6af30c74ad5a142596c02282e34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb76359a-0057-4f43-b30e-f35af43e7983', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.330209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4333ef06-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': 'af8167d715ad692102645db062ec8b445d65ebe912d60a13d15d3686a9866ee6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.330209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4333f898-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': '99fb62d7a2fc51c92ee5dfba3c70eaf9d91c706eea52bcd4e14fead690219d9d'}]}, 'timestamp': '2026-02-20 09:55:18.330735', '_unique_id': '1331dc289bed47e49872781d181b2646'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:55:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:55:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e148 e148: 6 total, 6 up, 6 in
Feb 20 04:55:18 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:18.371 2 INFO neutron.agent.securitygroups_rpc [None req-6afebda2-88bb-41f1-8c70-a0608f1757d1 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m
Feb 20 04:55:18 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:18.396 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 20 04:55:18 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 04:55:19 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:19.207 264355 INFO neutron.agent.linux.ip_lib [None req-3e6675b4-0a31-4041-8d9b-35d52c04f656 - - - - - -] Device tapad165bef-73 cannot be used as it has no MAC address#033[00m
Feb 20 04:55:19 localhost ovn_controller[156798]: 2026-02-20T09:55:19Z|00243|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 04:55:19 localhost nova_compute[281288]: 2026-02-20 09:55:19.276 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:55:19 localhost nova_compute[281288]: 2026-02-20 09:55:19.285 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:55:19 localhost kernel: device tapad165bef-73 entered promiscuous mode
Feb 20 04:55:19 localhost NetworkManager[5988]: [1771581319.2935] manager: (tapad165bef-73): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Feb 20 04:55:19 localhost ovn_controller[156798]: 2026-02-20T09:55:19Z|00244|binding|INFO|Claiming lport ad165bef-73b3-4f1e-864a-c401cd5b89ed for this chassis.
Feb 20 04:55:19 localhost ovn_controller[156798]: 2026-02-20T09:55:19Z|00245|binding|INFO|ad165bef-73b3-4f1e-864a-c401cd5b89ed: Claiming unknown
Feb 20 04:55:19 localhost nova_compute[281288]: 2026-02-20 09:55:19.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:55:19 localhost systemd-udevd[314507]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 04:55:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:19.310 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-1569f18d-1cf2-4113-a1bd-2d35906eb20f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1569f18d-1cf2-4113-a1bd-2d35906eb20f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=081736e4-c4cd-4a80-b501-1dcc1e64a740, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ad165bef-73b3-4f1e-864a-c401cd5b89ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 20 04:55:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:19.312 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ad165bef-73b3-4f1e-864a-c401cd5b89ed in datapath 1569f18d-1cf2-4113-a1bd-2d35906eb20f bound to our chassis#033[00m
Feb 20 04:55:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:19.314 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1569f18d-1cf2-4113-a1bd-2d35906eb20f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 20 04:55:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:19.315 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7c09c8-edeb-4c8c-b079-d9ce8a0f0a88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 04:55:19 localhost nova_compute[281288]: 2026-02-20 09:55:19.326 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:55:19 localhost ovn_controller[156798]: 2026-02-20T09:55:19Z|00246|binding|INFO|Setting lport ad165bef-73b3-4f1e-864a-c401cd5b89ed ovn-installed in OVS
Feb 20 04:55:19 localhost ovn_controller[156798]: 2026-02-20T09:55:19Z|00247|binding|INFO|Setting lport ad165bef-73b3-4f1e-864a-c401cd5b89ed up in Southbound
Feb 20 04:55:19 localhost nova_compute[281288]: 2026-02-20 09:55:19.331 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:55:19 localhost nova_compute[281288]: 2026-02-20 09:55:19.332 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:55:19 localhost nova_compute[281288]: 2026-02-20 09:55:19.370 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:55:19 localhost nova_compute[281288]: 2026-02-20 09:55:19.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 04:55:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 04:55:19 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1519354984' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 04:55:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 04:55:19 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1519354984' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 04:55:20 localhost podman[314560]:
Feb 20 04:55:20 localhost podman[314560]: 2026-02-20 09:55:20.274800575 +0000 UTC m=+0.090602300 container create 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 04:55:20 localhost systemd[1]: Started libpod-conmon-8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d.scope.
Feb 20 04:55:20 localhost podman[314560]: 2026-02-20 09:55:20.231867472 +0000 UTC m=+0.047669217 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:55:20 localhost systemd[1]: tmp-crun.d7Xkyv.mount: Deactivated successfully. Feb 20 04:55:20 localhost systemd[1]: Started libcrun container. Feb 20 04:55:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7724903b31542ba17b88488ef9f53de08d696e33d16bf9339b383229babbfc0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:55:20 localhost podman[314560]: 2026-02-20 09:55:20.372141468 +0000 UTC m=+0.187943193 container init 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 04:55:20 localhost podman[314560]: 2026-02-20 09:55:20.388147154 +0000 UTC m=+0.203948879 container start 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:55:20 localhost dnsmasq[314579]: started, version 2.85 cachesize 150 Feb 20 04:55:20 
localhost dnsmasq[314579]: DNS service limited to local subnets Feb 20 04:55:20 localhost dnsmasq[314579]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:55:20 localhost dnsmasq[314579]: warning: no upstream servers configured Feb 20 04:55:20 localhost dnsmasq-dhcp[314579]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:55:20 localhost dnsmasq[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/addn_hosts - 0 addresses Feb 20 04:55:20 localhost dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/host Feb 20 04:55:20 localhost dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/opts Feb 20 04:55:20 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:20.590 264355 INFO neutron.agent.dhcp.agent [None req-0c4b6dd7-1e41-4948-91e1-7a8e88bd4fa7 - - - - - -] DHCP configuration for ports {'c8bb9739-380a-474c-a432-b1d24c789ff6'} is completed#033[00m Feb 20 04:55:20 localhost sshd[314580]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:55:21 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:21.050 2 INFO neutron.agent.securitygroups_rpc [None req-0c0c6410-fbc5-4b85-ab45-3c003033a966 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:21 localhost systemd[1]: tmp-crun.N3vkPb.mount: Deactivated successfully. 
Feb 20 04:55:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:22 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:22.618 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:22Z, description=, device_id=be6b9a78-3f77-4a1d-b448-ffccd7465c5d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5af3f48a-8899-4613-9124-3d7003577e8a, ip_allocation=immediate, mac_address=fa:16:3e:09:ac:b1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:16Z, description=, dns_domain=, id=1569f18d-1cf2-4113-a1bd-2d35906eb20f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1040364741, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29694, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2005, status=ACTIVE, subnets=['a2044cb0-b4a9-4d72-a4e0-16c7a2c0d8d7'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:17Z, vlan_transparent=None, network_id=1569f18d-1cf2-4113-a1bd-2d35906eb20f, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2036, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:22Z on network 1569f18d-1cf2-4113-a1bd-2d35906eb20f#033[00m Feb 20 04:55:22 localhost 
ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 e149: 6 total, 6 up, 6 in Feb 20 04:55:22 localhost dnsmasq[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/addn_hosts - 1 addresses Feb 20 04:55:22 localhost dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/host Feb 20 04:55:22 localhost dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/opts Feb 20 04:55:22 localhost podman[314599]: 2026-02-20 09:55:22.844878189 +0000 UTC m=+0.070260313 container kill 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:22 localhost systemd[1]: tmp-crun.BnEo5U.mount: Deactivated successfully. Feb 20 04:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:55:22 localhost podman[314614]: 2026-02-20 09:55:22.960643231 +0000 UTC m=+0.077923045 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:55:22 localhost podman[314614]: 2026-02-20 09:55:22.969276703 +0000 UTC m=+0.086556517 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:55:22 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:55:23 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:23.197 264355 INFO neutron.agent.dhcp.agent [None req-73843a85-b37f-499a-b518-782de643ab28 - - - - - -] DHCP configuration for ports {'5af3f48a-8899-4613-9124-3d7003577e8a'} is completed#033[00m Feb 20 04:55:23 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:23.358 2 INFO neutron.agent.securitygroups_rpc [None req-6678ed04-c4d3-4555-9813-927645c955fd 5f5cb06fd4c045b1bfc3f890392c3165 54500b20d6a643669fbf357242dde27f - - default default] Security group member updated ['b8016b60-fce7-435d-b877-28993ceda4d5']#033[00m Feb 20 04:55:23 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:23.540 2 INFO neutron.agent.securitygroups_rpc [None req-08ae8aba-2609-40c5-899b-086a86995061 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:24 localhost nova_compute[281288]: 2026-02-20 09:55:24.319 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:24 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:24.577 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:22Z, description=, device_id=be6b9a78-3f77-4a1d-b448-ffccd7465c5d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5af3f48a-8899-4613-9124-3d7003577e8a, 
ip_allocation=immediate, mac_address=fa:16:3e:09:ac:b1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:16Z, description=, dns_domain=, id=1569f18d-1cf2-4113-a1bd-2d35906eb20f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1040364741, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29694, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2005, status=ACTIVE, subnets=['a2044cb0-b4a9-4d72-a4e0-16c7a2c0d8d7'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:17Z, vlan_transparent=None, network_id=1569f18d-1cf2-4113-a1bd-2d35906eb20f, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2036, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:22Z on network 1569f18d-1cf2-4113-a1bd-2d35906eb20f#033[00m Feb 20 04:55:24 localhost podman[314660]: 2026-02-20 09:55:24.801914462 +0000 UTC m=+0.064808467 container kill 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:55:24 localhost dnsmasq[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/addn_hosts - 1 
addresses Feb 20 04:55:24 localhost dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/host Feb 20 04:55:24 localhost dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/opts Feb 20 04:55:25 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:25.074 264355 INFO neutron.agent.dhcp.agent [None req-42ee7eb4-b053-4e18-907b-4ebfb8bb6b7a - - - - - -] DHCP configuration for ports {'5af3f48a-8899-4613-9124-3d7003577e8a'} is completed#033[00m Feb 20 04:55:25 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:25.589 2 INFO neutron.agent.securitygroups_rpc [None req-65f746b2-e25c-42a6-a0af-3bc4d3abfc01 5f5cb06fd4c045b1bfc3f890392c3165 54500b20d6a643669fbf357242dde27f - - default default] Security group member updated ['b8016b60-fce7-435d-b877-28993ceda4d5']#033[00m Feb 20 04:55:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:26 localhost openstack_network_exporter[244414]: ERROR 09:55:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:55:26 localhost openstack_network_exporter[244414]: Feb 20 04:55:26 localhost openstack_network_exporter[244414]: ERROR 09:55:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:55:26 localhost openstack_network_exporter[244414]: Feb 20 04:55:26 localhost dnsmasq[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/addn_hosts - 0 addresses Feb 20 04:55:26 localhost dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/host Feb 20 04:55:26 localhost podman[314695]: 2026-02-20 09:55:26.822383312 +0000 UTC m=+0.066539100 container kill 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:55:26 localhost dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/opts Feb 20 04:55:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:55:26 localhost podman[314708]: 2026-02-20 09:55:26.939249467 +0000 UTC m=+0.090430444 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:55:26 localhost podman[314708]: 2026-02-20 09:55:26.946759645 +0000 UTC m=+0.097940592 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:55:26 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:55:27 localhost ovn_controller[156798]: 2026-02-20T09:55:27Z|00248|binding|INFO|Releasing lport ad165bef-73b3-4f1e-864a-c401cd5b89ed from this chassis (sb_readonly=0) Feb 20 04:55:27 localhost kernel: device tapad165bef-73 left promiscuous mode Feb 20 04:55:27 localhost nova_compute[281288]: 2026-02-20 09:55:27.210 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:27 localhost ovn_controller[156798]: 2026-02-20T09:55:27Z|00249|binding|INFO|Setting lport ad165bef-73b3-4f1e-864a-c401cd5b89ed down in Southbound Feb 20 04:55:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:27.224 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-1569f18d-1cf2-4113-a1bd-2d35906eb20f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1569f18d-1cf2-4113-a1bd-2d35906eb20f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=081736e4-c4cd-4a80-b501-1dcc1e64a740, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ad165bef-73b3-4f1e-864a-c401cd5b89ed) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:27.226 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ad165bef-73b3-4f1e-864a-c401cd5b89ed in datapath 1569f18d-1cf2-4113-a1bd-2d35906eb20f unbound from our chassis#033[00m Feb 20 04:55:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:27.228 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1569f18d-1cf2-4113-a1bd-2d35906eb20f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:55:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:27.229 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9e14a83a-d1e2-441d-8bab-67d905c9846a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:27 localhost nova_compute[281288]: 2026-02-20 09:55:27.244 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:27 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:27.323 2 INFO neutron.agent.securitygroups_rpc [None req-09c13c9e-9ba3-4904-bcac-09b2f5e1651f 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:27 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:27.573 264355 INFO neutron.agent.linux.ip_lib [None req-09c5e945-f56a-45bc-a62e-02d6db468313 - - - - - -] Device tapfa0c1fb0-9b cannot be used as it has no MAC address#033[00m Feb 20 04:55:27 localhost nova_compute[281288]: 2026-02-20 09:55:27.596 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 
04:55:27 localhost kernel: device tapfa0c1fb0-9b entered promiscuous mode Feb 20 04:55:27 localhost NetworkManager[5988]: [1771581327.6062] manager: (tapfa0c1fb0-9b): new Generic device (/org/freedesktop/NetworkManager/Devices/41) Feb 20 04:55:27 localhost ovn_controller[156798]: 2026-02-20T09:55:27Z|00250|binding|INFO|Claiming lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a for this chassis. Feb 20 04:55:27 localhost nova_compute[281288]: 2026-02-20 09:55:27.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:27 localhost ovn_controller[156798]: 2026-02-20T09:55:27Z|00251|binding|INFO|fa0c1fb0-9b16-4f58-88ba-a30e77afed6a: Claiming unknown Feb 20 04:55:27 localhost systemd-udevd[314749]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:55:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:27.619 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-30554b8e-97d6-457c-a559-c8a175beb267', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30554b8e-97d6-457c-a559-c8a175beb267', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], 
mirror_rules=[], datapath=eb276814-fa37-4413-a869-338c3a114128, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fa0c1fb0-9b16-4f58-88ba-a30e77afed6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:27.621 162652 INFO neutron.agent.ovn.metadata.agent [-] Port fa0c1fb0-9b16-4f58-88ba-a30e77afed6a in datapath 30554b8e-97d6-457c-a559-c8a175beb267 bound to our chassis#033[00m Feb 20 04:55:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:27.626 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 30554b8e-97d6-457c-a559-c8a175beb267 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:27.627 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8c22cb-949a-4307-a889-62772023e6c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:27 localhost ovn_controller[156798]: 2026-02-20T09:55:27Z|00252|binding|INFO|Setting lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a ovn-installed in OVS Feb 20 04:55:27 localhost ovn_controller[156798]: 2026-02-20T09:55:27Z|00253|binding|INFO|Setting lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a up in Southbound Feb 20 04:55:27 localhost nova_compute[281288]: 2026-02-20 09:55:27.643 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:27 localhost nova_compute[281288]: 2026-02-20 09:55:27.645 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:27 localhost nova_compute[281288]: 2026-02-20 
09:55:27.680 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:27 localhost nova_compute[281288]: 2026-02-20 09:55:27.708 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:28 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:28.214 2 INFO neutron.agent.securitygroups_rpc [None req-df2a63c6-7355-4f99-a5c3-49ea7a77359b 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:28 localhost podman[314804]: Feb 20 04:55:28 localhost podman[314804]: 2026-02-20 09:55:28.495203744 +0000 UTC m=+0.090049573 container create 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:55:28 localhost systemd[1]: Started libpod-conmon-4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad.scope. Feb 20 04:55:28 localhost podman[314804]: 2026-02-20 09:55:28.4509192 +0000 UTC m=+0.045765069 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:55:28 localhost systemd[1]: Started libcrun container. 
Feb 20 04:55:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/285bed8d22a682481e23070c8097cc1e4b38ac43cbaebd1571bb61275146e81a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:55:28 localhost podman[314804]: 2026-02-20 09:55:28.577197571 +0000 UTC m=+0.172043400 container init 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 20 04:55:28 localhost podman[314804]: 2026-02-20 09:55:28.592221817 +0000 UTC m=+0.187067666 container start 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:55:28 localhost dnsmasq[314824]: started, version 2.85 cachesize 150 Feb 20 04:55:28 localhost dnsmasq[314824]: DNS service limited to local subnets Feb 20 04:55:28 localhost dnsmasq[314824]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:55:28 localhost dnsmasq[314824]: warning: no upstream servers 
configured Feb 20 04:55:28 localhost dnsmasq-dhcp[314824]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:55:28 localhost dnsmasq[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/addn_hosts - 0 addresses Feb 20 04:55:28 localhost dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/host Feb 20 04:55:28 localhost dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/opts Feb 20 04:55:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:28.659 264355 INFO neutron.agent.dhcp.agent [None req-09c5e945-f56a-45bc-a62e-02d6db468313 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:27Z, description=, device_id=cf3f380b-2884-4a30-946d-3fd2eacae5d3, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6aed2313-a15f-4b24-8da1-1ee435817e36, ip_allocation=immediate, mac_address=fa:16:3e:24:c2:d9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:25Z, description=, dns_domain=, id=30554b8e-97d6-457c-a559-c8a175beb267, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1940356692, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31716, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2052, status=ACTIVE, subnets=['7584d118-7e5d-407c-8050-c5c8e10e09f6'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:26Z, vlan_transparent=None, network_id=30554b8e-97d6-457c-a559-c8a175beb267, port_security_enabled=False, 
project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2061, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:27Z on network 30554b8e-97d6-457c-a559-c8a175beb267#033[00m Feb 20 04:55:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:28.755 264355 INFO neutron.agent.dhcp.agent [None req-a7ceac08-d506-4028-8026-9e8d2146a00e - - - - - -] DHCP configuration for ports {'b59c76b5-90e3-41db-af11-d782540bc114'} is completed#033[00m Feb 20 04:55:28 localhost dnsmasq[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/addn_hosts - 1 addresses Feb 20 04:55:28 localhost dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/host Feb 20 04:55:28 localhost dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/opts Feb 20 04:55:28 localhost podman[314856]: 2026-02-20 09:55:28.876317466 +0000 UTC m=+0.064032073 container kill 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:28 localhost podman[314868]: 2026-02-20 09:55:28.928867371 +0000 UTC m=+0.064569370 container kill 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 04:55:28 localhost dnsmasq[314579]: exiting on receipt of SIGTERM Feb 20 04:55:28 localhost systemd[1]: libpod-8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d.scope: Deactivated successfully. Feb 20 04:55:29 localhost podman[314885]: 2026-02-20 09:55:29.00104072 +0000 UTC m=+0.058903178 container died 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:55:29 localhost podman[314885]: 2026-02-20 09:55:29.084336237 +0000 UTC m=+0.142198685 container cleanup 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:55:29 localhost systemd[1]: libpod-conmon-8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d.scope: Deactivated successfully. 
Feb 20 04:55:29 localhost podman[314892]: 2026-02-20 09:55:29.107420847 +0000 UTC m=+0.142127102 container remove 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:55:29 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.136 264355 INFO neutron.agent.dhcp.agent [None req-64c46c8e-5526-4f37-aa14-55f8204a4df1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:29 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:29.147 2 INFO neutron.agent.securitygroups_rpc [None req-abf59bbc-b23d-40a5-812c-25dd2e2a84ba 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:29 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.219 264355 INFO neutron.agent.dhcp.agent [None req-d74ca894-b05f-4447-b3df-cf0f8ceec184 - - - - - -] DHCP configuration for ports {'6aed2313-a15f-4b24-8da1-1ee435817e36'} is completed#033[00m Feb 20 04:55:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:55:29 localhost nova_compute[281288]: 2026-02-20 09:55:29.366 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:29 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.375 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:27Z, description=, device_id=cf3f380b-2884-4a30-946d-3fd2eacae5d3, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6aed2313-a15f-4b24-8da1-1ee435817e36, ip_allocation=immediate, mac_address=fa:16:3e:24:c2:d9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:25Z, description=, dns_domain=, id=30554b8e-97d6-457c-a559-c8a175beb267, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1940356692, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31716, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2052, status=ACTIVE, subnets=['7584d118-7e5d-407c-8050-c5c8e10e09f6'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:26Z, vlan_transparent=None, network_id=30554b8e-97d6-457c-a559-c8a175beb267, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2061, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:27Z on network 30554b8e-97d6-457c-a559-c8a175beb267#033[00m Feb 20 
04:55:29 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.431 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:29 localhost podman[314920]: 2026-02-20 09:55:29.435249583 +0000 UTC m=+0.121787315 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal) Feb 20 04:55:29 localhost podman[314920]: 2026-02-20 09:55:29.450143576 +0000 UTC m=+0.136681308 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal 
rhel9, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.7, distribution-scope=public, maintainer=Red Hat, Inc., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 20 04:55:29 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:55:29 localhost systemd[1]: tmp-crun.yQTdTo.mount: Deactivated successfully. Feb 20 04:55:29 localhost systemd[1]: var-lib-containers-storage-overlay-f7724903b31542ba17b88488ef9f53de08d696e33d16bf9339b383229babbfc0-merged.mount: Deactivated successfully. Feb 20 04:55:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d-userdata-shm.mount: Deactivated successfully. Feb 20 04:55:29 localhost systemd[1]: run-netns-qdhcp\x2d1569f18d\x2d1cf2\x2d4113\x2da1bd\x2d2d35906eb20f.mount: Deactivated successfully. 
Feb 20 04:55:29 localhost dnsmasq[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/addn_hosts - 1 addresses Feb 20 04:55:29 localhost dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/host Feb 20 04:55:29 localhost dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/opts Feb 20 04:55:29 localhost podman[314958]: 2026-02-20 09:55:29.582852742 +0000 UTC m=+0.060127895 container kill 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:55:29 localhost ovn_controller[156798]: 2026-02-20T09:55:29Z|00254|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:55:29 localhost nova_compute[281288]: 2026-02-20 09:55:29.711 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:29 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.770 264355 INFO neutron.agent.dhcp.agent [None req-087c180b-137c-44aa-83fc-d3e7ea4c1690 - - - - - -] DHCP configuration for ports {'6aed2313-a15f-4b24-8da1-1ee435817e36'} is completed#033[00m Feb 20 04:55:30 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:30.324 2 INFO neutron.agent.securitygroups_rpc [None req-c4977344-b1fb-45a9-a725-767a5df232d2 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated 
['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:31 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:31.049 2 INFO neutron.agent.securitygroups_rpc [None req-3c26b833-1dd4-4db0-a9c6-673823ae81db 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:31 localhost dnsmasq[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/addn_hosts - 0 addresses Feb 20 04:55:31 localhost dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/host Feb 20 04:55:31 localhost dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/opts Feb 20 04:55:31 localhost podman[314997]: 2026-02-20 09:55:31.704035987 +0000 UTC m=+0.059995761 container kill 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:55:31 localhost kernel: device tapfa0c1fb0-9b left promiscuous mode Feb 20 04:55:31 localhost ovn_controller[156798]: 2026-02-20T09:55:31Z|00255|binding|INFO|Releasing lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a from this chassis (sb_readonly=0) Feb 20 04:55:31 localhost ovn_controller[156798]: 2026-02-20T09:55:31Z|00256|binding|INFO|Setting lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a down in 
Southbound Feb 20 04:55:31 localhost nova_compute[281288]: 2026-02-20 09:55:31.924 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:31.934 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-30554b8e-97d6-457c-a559-c8a175beb267', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30554b8e-97d6-457c-a559-c8a175beb267', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb276814-fa37-4413-a869-338c3a114128, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fa0c1fb0-9b16-4f58-88ba-a30e77afed6a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:31.936 162652 INFO neutron.agent.ovn.metadata.agent [-] Port fa0c1fb0-9b16-4f58-88ba-a30e77afed6a in datapath 30554b8e-97d6-457c-a559-c8a175beb267 unbound from our chassis#033[00m Feb 20 04:55:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:31.938 
162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 30554b8e-97d6-457c-a559-c8a175beb267 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:31.939 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[be52e880-932b-4a2f-b8a1-ce327e20c0a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:31 localhost nova_compute[281288]: 2026-02-20 09:55:31.947 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:32 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:32.585 2 INFO neutron.agent.securitygroups_rpc [None req-2e3d6688-1793-41b9-a2f2-f815ee6132fe 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:55:32 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:32.964 2 INFO neutron.agent.securitygroups_rpc [None req-d888e083-a1eb-4717-8e36-0999aadb5157 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:33 localhost dnsmasq[314824]: exiting on receipt of SIGTERM Feb 20 04:55:33 localhost podman[315037]: 2026-02-20 09:55:33.539421051 +0000 UTC m=+0.063316862 container kill 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 04:55:33 localhost systemd[1]: libpod-4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad.scope: Deactivated successfully. Feb 20 04:55:33 localhost podman[315050]: 2026-02-20 09:55:33.61519778 +0000 UTC m=+0.061676162 container died 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:33 localhost systemd[1]: tmp-crun.JfQgfZ.mount: Deactivated successfully. Feb 20 04:55:33 localhost podman[315050]: 2026-02-20 09:55:33.650946865 +0000 UTC m=+0.097425197 container cleanup 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:55:33 localhost systemd[1]: libpod-conmon-4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad.scope: Deactivated successfully. 
Feb 20 04:55:33 localhost podman[315052]: 2026-02-20 09:55:33.70286034 +0000 UTC m=+0.137606956 container remove 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 20 04:55:33 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:33.905 264355 INFO neutron.agent.dhcp.agent [None req-2cd93e5e-e68a-4b2b-a8e7-7ff60c86ff36 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:34 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:34.063 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:34 localhost nova_compute[281288]: 2026-02-20 09:55:34.411 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:34 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:34.482 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:34 localhost systemd[1]: var-lib-containers-storage-overlay-285bed8d22a682481e23070c8097cc1e4b38ac43cbaebd1571bb61275146e81a-merged.mount: Deactivated successfully. Feb 20 04:55:34 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:55:34 localhost systemd[1]: run-netns-qdhcp\x2d30554b8e\x2d97d6\x2d457c\x2da559\x2dc8a175beb267.mount: Deactivated successfully. Feb 20 04:55:34 localhost ovn_controller[156798]: 2026-02-20T09:55:34Z|00257|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:55:34 localhost nova_compute[281288]: 2026-02-20 09:55:34.928 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:34 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:34.953 2 INFO neutron.agent.securitygroups_rpc [None req-3b974f9c-5163-47e8-89d2-00acf380ad82 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:55:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:55:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:55:36 localhost podman[315079]: 2026-02-20 09:55:36.165666029 +0000 UTC m=+0.097950923 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 04:55:36 localhost podman[315080]: 2026-02-20 09:55:36.242169539 +0000 UTC m=+0.170551295 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:36 localhost podman[315079]: 2026-02-20 09:55:36.266380614 +0000 UTC m=+0.198665468 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:55:36 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:55:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:36 localhost podman[315080]: 2026-02-20 09:55:36.320926289 +0000 UTC m=+0.249308055 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Feb 20 04:55:36 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:55:37 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:37.529 2 INFO neutron.agent.securitygroups_rpc [None req-a4ab168f-7329-4461-8236-8f00fb1e3c92 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:37.562 264355 INFO neutron.agent.linux.ip_lib [None req-e46b29f8-41d0-4d97-ab45-2edba2f5ff37 - - - - - -] Device tapaf75ed32-13 cannot be used as it has no MAC address#033[00m Feb 20 04:55:37 localhost nova_compute[281288]: 2026-02-20 09:55:37.624 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:37 localhost kernel: device tapaf75ed32-13 entered promiscuous mode Feb 20 04:55:37 localhost NetworkManager[5988]: [1771581337.6317] manager: (tapaf75ed32-13): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Feb 20 04:55:37 localhost nova_compute[281288]: 2026-02-20 09:55:37.632 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:37 localhost ovn_controller[156798]: 2026-02-20T09:55:37Z|00258|binding|INFO|Claiming lport af75ed32-130d-4e8e-87f9-48ee296520f0 for this chassis. 
Feb 20 04:55:37 localhost ovn_controller[156798]: 2026-02-20T09:55:37Z|00259|binding|INFO|af75ed32-130d-4e8e-87f9-48ee296520f0: Claiming unknown Feb 20 04:55:37 localhost systemd-udevd[315133]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:55:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:37.642 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-41ccba1b-d4dd-4580-8736-703a9b44e71b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41ccba1b-d4dd-4580-8736-703a9b44e71b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=876849fd-270a-4183-8f36-c602ceb68d2b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=af75ed32-130d-4e8e-87f9-48ee296520f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:37.645 162652 INFO neutron.agent.ovn.metadata.agent [-] Port af75ed32-130d-4e8e-87f9-48ee296520f0 in datapath 41ccba1b-d4dd-4580-8736-703a9b44e71b bound to our chassis#033[00m Feb 20 04:55:37 localhost ovn_metadata_agent[162647]: 
2026-02-20 09:55:37.647 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 41ccba1b-d4dd-4580-8736-703a9b44e71b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:37.648 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4a38f513-f715-4bf8-9c80-524ddb23b111]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:37 localhost journal[229984]: ethtool ioctl error on tapaf75ed32-13: No such device Feb 20 04:55:37 localhost journal[229984]: ethtool ioctl error on tapaf75ed32-13: No such device Feb 20 04:55:37 localhost ovn_controller[156798]: 2026-02-20T09:55:37Z|00260|binding|INFO|Setting lport af75ed32-130d-4e8e-87f9-48ee296520f0 ovn-installed in OVS Feb 20 04:55:37 localhost ovn_controller[156798]: 2026-02-20T09:55:37Z|00261|binding|INFO|Setting lport af75ed32-130d-4e8e-87f9-48ee296520f0 up in Southbound Feb 20 04:55:37 localhost nova_compute[281288]: 2026-02-20 09:55:37.676 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:37 localhost journal[229984]: ethtool ioctl error on tapaf75ed32-13: No such device Feb 20 04:55:37 localhost journal[229984]: ethtool ioctl error on tapaf75ed32-13: No such device Feb 20 04:55:37 localhost journal[229984]: ethtool ioctl error on tapaf75ed32-13: No such device Feb 20 04:55:37 localhost journal[229984]: ethtool ioctl error on tapaf75ed32-13: No such device Feb 20 04:55:37 localhost journal[229984]: ethtool ioctl error on tapaf75ed32-13: No such device Feb 20 04:55:37 localhost journal[229984]: ethtool ioctl error on tapaf75ed32-13: No such device Feb 20 04:55:37 localhost nova_compute[281288]: 2026-02-20 09:55:37.729 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:37 localhost nova_compute[281288]: 2026-02-20 09:55:37.762 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:38 localhost podman[315204]: Feb 20 04:55:38 localhost podman[315204]: 2026-02-20 09:55:38.595960602 +0000 UTC m=+0.072024586 container create 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:55:38 localhost systemd[1]: Started libpod-conmon-427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c.scope. Feb 20 04:55:38 localhost systemd[1]: Started libcrun container. 
Feb 20 04:55:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98a11e8164b248f48392f782c6ee5ccf0ba56d6a01fd60072357a42bcea10a93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:55:38 localhost podman[315204]: 2026-02-20 09:55:38.557942999 +0000 UTC m=+0.034006983 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:55:38 localhost podman[315204]: 2026-02-20 09:55:38.66114217 +0000 UTC m=+0.137206134 container init 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:55:38 localhost podman[315204]: 2026-02-20 09:55:38.672162384 +0000 UTC m=+0.148226338 container start 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:55:38 localhost dnsmasq[315222]: started, version 2.85 cachesize 150 Feb 20 04:55:38 localhost dnsmasq[315222]: DNS service limited to local subnets Feb 20 04:55:38 localhost dnsmasq[315222]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:55:38 localhost dnsmasq[315222]: warning: no upstream servers configured Feb 20 04:55:38 localhost dnsmasq-dhcp[315222]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:55:38 localhost dnsmasq[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/addn_hosts - 0 addresses Feb 20 04:55:38 localhost dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/host Feb 20 04:55:38 localhost dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/opts Feb 20 04:55:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:38.750 264355 INFO neutron.agent.dhcp.agent [None req-a9886a1c-647f-4ea8-a821-7fccb7c50cbd - - - - - -] DHCP configuration for ports {'1d5a0285-6e9b-4ceb-bbc6-a537f6efba7b'} is completed#033[00m Feb 20 04:55:38 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:38.905 2 INFO neutron.agent.securitygroups_rpc [None req-eb4533df-2883-48ce-b913-efeaaf3f9e10 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:39 localhost nova_compute[281288]: 2026-02-20 09:55:39.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:39.880 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:39Z, description=, device_id=a5051175-9093-4a92-8a71-099c941b94b5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=adcb09ab-6dda-431c-9b20-ce19a23f1bc4, ip_allocation=immediate, mac_address=fa:16:3e:40:5f:b9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:35Z, description=, dns_domain=, id=41ccba1b-d4dd-4580-8736-703a9b44e71b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1367145819, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40210, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2091, status=ACTIVE, subnets=['cc518582-23ac-498e-b92e-77fca38ee666'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:36Z, vlan_transparent=None, network_id=41ccba1b-d4dd-4580-8736-703a9b44e71b, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2123, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:39Z on network 41ccba1b-d4dd-4580-8736-703a9b44e71b#033[00m Feb 20 04:55:40 localhost podman[315292]: 2026-02-20 09:55:40.117579547 +0000 UTC m=+0.070489970 container kill 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:55:40 localhost dnsmasq[315222]: read 
/var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/addn_hosts - 1 addresses Feb 20 04:55:40 localhost dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/host Feb 20 04:55:40 localhost dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/opts Feb 20 04:55:40 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:40.413 264355 INFO neutron.agent.dhcp.agent [None req-efc687d8-befb-434a-9a23-9fa8c0641eb7 - - - - - -] DHCP configuration for ports {'adcb09ab-6dda-431c-9b20-ce19a23f1bc4'} is completed#033[00m Feb 20 04:55:40 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:55:40 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:55:40 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:40.623 2 INFO neutron.agent.securitygroups_rpc [None req-7a51814a-d3bb-4d9f-a2cf-a8e1904feac9 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:41.004 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:39Z, description=, device_id=a5051175-9093-4a92-8a71-099c941b94b5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=adcb09ab-6dda-431c-9b20-ce19a23f1bc4, ip_allocation=immediate, mac_address=fa:16:3e:40:5f:b9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:35Z, description=, dns_domain=, 
id=41ccba1b-d4dd-4580-8736-703a9b44e71b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1367145819, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40210, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2091, status=ACTIVE, subnets=['cc518582-23ac-498e-b92e-77fca38ee666'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:36Z, vlan_transparent=None, network_id=41ccba1b-d4dd-4580-8736-703a9b44e71b, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2123, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:39Z on network 41ccba1b-d4dd-4580-8736-703a9b44e71b#033[00m Feb 20 04:55:41 localhost dnsmasq[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/addn_hosts - 1 addresses Feb 20 04:55:41 localhost dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/host Feb 20 04:55:41 localhost podman[315362]: 2026-02-20 09:55:41.223199111 +0000 UTC m=+0.058704582 container kill 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:55:41 localhost dnsmasq-dhcp[315222]: read 
/var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/opts Feb 20 04:55:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:41.468 264355 INFO neutron.agent.dhcp.agent [None req-b1c870c0-549d-463a-8614-fad00a2b7163 - - - - - -] DHCP configuration for ports {'adcb09ab-6dda-431c-9b20-ce19a23f1bc4'} is completed#033[00m Feb 20 04:55:41 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:41.673 2 INFO neutron.agent.securitygroups_rpc [None req-45da2e31-24b0-4a9e-8fbd-b3074d839731 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:41.704 264355 INFO neutron.agent.linux.ip_lib [None req-1485b343-de9e-4d46-8292-a63df142710c - - - - - -] Device tap4a535ed2-00 cannot be used as it has no MAC address#033[00m Feb 20 04:55:41 localhost nova_compute[281288]: 2026-02-20 09:55:41.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:41 localhost kernel: device tap4a535ed2-00 entered promiscuous mode Feb 20 04:55:41 localhost NetworkManager[5988]: [1771581341.7371] manager: (tap4a535ed2-00): new Generic device (/org/freedesktop/NetworkManager/Devices/43) Feb 20 04:55:41 localhost ovn_controller[156798]: 2026-02-20T09:55:41Z|00262|binding|INFO|Claiming lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 for this chassis. 
Feb 20 04:55:41 localhost ovn_controller[156798]: 2026-02-20T09:55:41Z|00263|binding|INFO|4a535ed2-00be-4ec7-8d9e-24afdab13877: Claiming unknown Feb 20 04:55:41 localhost nova_compute[281288]: 2026-02-20 09:55:41.738 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:41 localhost systemd-udevd[315392]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:55:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:41.750 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-cd45d753-df49-44f6-b419-6749d7fe84f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd45d753-df49-44f6-b419-6749d7fe84f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2f3d8db-7c26-4898-8e61-0b8ba04044df, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4a535ed2-00be-4ec7-8d9e-24afdab13877) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:41.753 162652 INFO 
neutron.agent.ovn.metadata.agent [-] Port 4a535ed2-00be-4ec7-8d9e-24afdab13877 in datapath cd45d753-df49-44f6-b419-6749d7fe84f3 bound to our chassis#033[00m Feb 20 04:55:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:41.755 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cd45d753-df49-44f6-b419-6749d7fe84f3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:55:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:41.756 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1d428b70-87cb-4b9b-9266-304615394891]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:41 localhost journal[229984]: ethtool ioctl error on tap4a535ed2-00: No such device Feb 20 04:55:41 localhost ovn_controller[156798]: 2026-02-20T09:55:41Z|00264|binding|INFO|Setting lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 ovn-installed in OVS Feb 20 04:55:41 localhost ovn_controller[156798]: 2026-02-20T09:55:41Z|00265|binding|INFO|Setting lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 up in Southbound Feb 20 04:55:41 localhost journal[229984]: ethtool ioctl error on tap4a535ed2-00: No such device Feb 20 04:55:41 localhost nova_compute[281288]: 2026-02-20 09:55:41.780 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:41 localhost journal[229984]: ethtool ioctl error on tap4a535ed2-00: No such device Feb 20 04:55:41 localhost journal[229984]: ethtool ioctl error on tap4a535ed2-00: No such device Feb 20 04:55:41 localhost journal[229984]: ethtool ioctl error on tap4a535ed2-00: No such device Feb 20 04:55:41 localhost journal[229984]: ethtool ioctl error on tap4a535ed2-00: No such device Feb 20 04:55:41 localhost journal[229984]: ethtool ioctl error on tap4a535ed2-00: 
No such device Feb 20 04:55:41 localhost journal[229984]: ethtool ioctl error on tap4a535ed2-00: No such device Feb 20 04:55:41 localhost nova_compute[281288]: 2026-02-20 09:55:41.817 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:41 localhost nova_compute[281288]: 2026-02-20 09:55:41.845 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:42 localhost podman[315463]: Feb 20 04:55:42 localhost podman[315463]: 2026-02-20 09:55:42.59008359 +0000 UTC m=+0.074162030 container create a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 20 04:55:42 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e150 e150: 6 total, 6 up, 6 in Feb 20 04:55:42 localhost systemd[1]: Started libpod-conmon-a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d.scope. Feb 20 04:55:42 localhost systemd[1]: tmp-crun.3FE9Zn.mount: Deactivated successfully. Feb 20 04:55:42 localhost podman[315463]: 2026-02-20 09:55:42.557210883 +0000 UTC m=+0.041289333 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:55:42 localhost systemd[1]: Started libcrun container. 
Feb 20 04:55:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a8113fcfd1358b3db8a5596a585afb795b665e94caae01a8c63b1b24d0a203b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:55:42 localhost podman[315463]: 2026-02-20 09:55:42.676800881 +0000 UTC m=+0.160879291 container init a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:42 localhost podman[315463]: 2026-02-20 09:55:42.683205605 +0000 UTC m=+0.167284015 container start a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:42 localhost dnsmasq[315482]: started, version 2.85 cachesize 150 Feb 20 04:55:42 localhost dnsmasq[315482]: DNS service limited to local subnets Feb 20 04:55:42 localhost dnsmasq[315482]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:55:42 localhost dnsmasq[315482]: warning: no upstream servers 
configured Feb 20 04:55:42 localhost dnsmasq-dhcp[315482]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Feb 20 04:55:42 localhost dnsmasq[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/addn_hosts - 0 addresses Feb 20 04:55:42 localhost dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/host Feb 20 04:55:42 localhost dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/opts Feb 20 04:55:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:42.750 264355 INFO neutron.agent.dhcp.agent [None req-1485b343-de9e-4d46-8292-a63df142710c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:41Z, description=, device_id=7f90544d-5ad8-4f1b-8f43-7851902677f5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b3a465d7-0fe1-4076-a907-134fc38d292d, ip_allocation=immediate, mac_address=fa:16:3e:54:61:36, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:39Z, description=, dns_domain=, id=cd45d753-df49-44f6-b419-6749d7fe84f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-181663761, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61227, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2126, status=ACTIVE, subnets=['8ee7ce1e-019f-46a3-86af-2fedd3583e02'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:40Z, vlan_transparent=None, network_id=cd45d753-df49-44f6-b419-6749d7fe84f3, port_security_enabled=False, 
project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2136, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:41Z on network cd45d753-df49-44f6-b419-6749d7fe84f3#033[00m Feb 20 04:55:42 localhost sshd[315502]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:55:42 localhost dnsmasq[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/addn_hosts - 1 addresses Feb 20 04:55:42 localhost dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/host Feb 20 04:55:42 localhost podman[315501]: 2026-02-20 09:55:42.888394721 +0000 UTC m=+0.041446978 container kill a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:55:42 localhost dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/opts Feb 20 04:55:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:42.911 264355 INFO neutron.agent.dhcp.agent [None req-7d4a75c5-73c2-4ee7-b46f-db483350b090 - - - - - -] DHCP configuration for ports {'74f77219-8820-4956-b74f-503a881f77fb'} is completed#033[00m Feb 20 04:55:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:55:43 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:43.047 264355 INFO neutron.agent.dhcp.agent [None req-b5a2ed51-0a74-4077-8f10-9b935b39d47b - - - - - -] DHCP configuration for ports {'b3a465d7-0fe1-4076-a907-134fc38d292d'} is completed#033[00m Feb 20 04:55:43 localhost podman[315525]: 2026-02-20 09:55:43.14547189 +0000 UTC m=+0.086634069 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute) Feb 20 04:55:43 localhost podman[315525]: 2026-02-20 09:55:43.162086454 +0000 UTC m=+0.103248633 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:55:43 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:55:43 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:43.313 2 INFO neutron.agent.securitygroups_rpc [None req-b9b9f9d2-32dc-4ef0-9ebd-12b2d2ae4ee8 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:55:43 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:55:43 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e151 e151: 6 total, 6 up, 6 in Feb 20 04:55:43 localhost nova_compute[281288]: 2026-02-20 09:55:43.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:43 localhost nova_compute[281288]: 2026-02-20 09:55:43.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 20 04:55:44 localhost dnsmasq[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/addn_hosts - 0 addresses Feb 20 04:55:44 localhost dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/host Feb 20 04:55:44 localhost podman[315560]: 2026-02-20 09:55:44.158851365 +0000 UTC 
m=+0.066380655 container kill 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:55:44 localhost dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/opts Feb 20 04:55:44 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:44.159 2 INFO neutron.agent.securitygroups_rpc [None req-6065317f-9707-4b49-a2f9-16f062a24577 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']#033[00m Feb 20 04:55:44 localhost ovn_controller[156798]: 2026-02-20T09:55:44Z|00266|binding|INFO|Releasing lport af75ed32-130d-4e8e-87f9-48ee296520f0 from this chassis (sb_readonly=0) Feb 20 04:55:44 localhost ovn_controller[156798]: 2026-02-20T09:55:44Z|00267|binding|INFO|Setting lport af75ed32-130d-4e8e-87f9-48ee296520f0 down in Southbound Feb 20 04:55:44 localhost kernel: device tapaf75ed32-13 left promiscuous mode Feb 20 04:55:44 localhost nova_compute[281288]: 2026-02-20 09:55:44.383 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:44.386 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:41Z, 
description=, device_id=7f90544d-5ad8-4f1b-8f43-7851902677f5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b3a465d7-0fe1-4076-a907-134fc38d292d, ip_allocation=immediate, mac_address=fa:16:3e:54:61:36, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:39Z, description=, dns_domain=, id=cd45d753-df49-44f6-b419-6749d7fe84f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-181663761, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61227, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2126, status=ACTIVE, subnets=['8ee7ce1e-019f-46a3-86af-2fedd3583e02'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:40Z, vlan_transparent=None, network_id=cd45d753-df49-44f6-b419-6749d7fe84f3, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2136, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:41Z on network cd45d753-df49-44f6-b419-6749d7fe84f3#033[00m Feb 20 04:55:44 localhost nova_compute[281288]: 2026-02-20 09:55:44.405 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:44.403 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], 
virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-41ccba1b-d4dd-4580-8736-703a9b44e71b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41ccba1b-d4dd-4580-8736-703a9b44e71b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=876849fd-270a-4183-8f36-c602ceb68d2b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=af75ed32-130d-4e8e-87f9-48ee296520f0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:44.406 162652 INFO neutron.agent.ovn.metadata.agent [-] Port af75ed32-130d-4e8e-87f9-48ee296520f0 in datapath 41ccba1b-d4dd-4580-8736-703a9b44e71b unbound from our chassis#033[00m Feb 20 04:55:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:44.409 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41ccba1b-d4dd-4580-8736-703a9b44e71b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:55:44 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:44.410 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c2510a50-4f86-4615-9571-5c25b9ddb593]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:44 localhost nova_compute[281288]: 2026-02-20 09:55:44.460 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:44 localhost dnsmasq[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/addn_hosts - 1 addresses Feb 20 04:55:44 localhost podman[315601]: 2026-02-20 09:55:44.594696518 +0000 UTC m=+0.060556399 container kill a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:55:44 localhost dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/host Feb 20 04:55:44 localhost dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/opts Feb 20 04:55:44 localhost nova_compute[281288]: 2026-02-20 09:55:44.826 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:44 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:44.895 264355 INFO neutron.agent.dhcp.agent [None req-446cb246-573f-40f5-9b5c-a84bb73bc1bc - - - - - -] DHCP configuration for ports {'b3a465d7-0fe1-4076-a907-134fc38d292d'} is completed#033[00m Feb 20 04:55:45 localhost dnsmasq[315222]: exiting on receipt of SIGTERM Feb 20 04:55:45 localhost systemd[1]: libpod-427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c.scope: Deactivated successfully. 
Feb 20 04:55:45 localhost podman[315639]: 2026-02-20 09:55:45.195714112 +0000 UTC m=+0.062569019 container kill 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:55:45 localhost podman[315651]: 2026-02-20 09:55:45.266769008 +0000 UTC m=+0.059162006 container died 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:55:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:55:45 localhost podman[315651]: 2026-02-20 09:55:45.303162092 +0000 UTC m=+0.095555040 container cleanup 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:45 localhost systemd[1]: libpod-conmon-427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c.scope: Deactivated successfully. Feb 20 04:55:45 localhost podman[315653]: 2026-02-20 09:55:45.355567631 +0000 UTC m=+0.137386129 container remove 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 20 04:55:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:45.387 264355 INFO neutron.agent.dhcp.agent [None req-61f6c2fa-305a-4b01-bf8f-36b14b468a72 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:45 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:45.388 264355 INFO neutron.agent.dhcp.agent [None req-61f6c2fa-305a-4b01-bf8f-36b14b468a72 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:55:45 localhost 
ovn_controller[156798]: 2026-02-20T09:55:45Z|00268|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:55:45 localhost nova_compute[281288]: 2026-02-20 09:55:45.624 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:46 localhost systemd[1]: var-lib-containers-storage-overlay-98a11e8164b248f48392f782c6ee5ccf0ba56d6a01fd60072357a42bcea10a93-merged.mount: Deactivated successfully. Feb 20 04:55:46 localhost systemd[1]: run-netns-qdhcp\x2d41ccba1b\x2dd4dd\x2d4580\x2d8736\x2d703a9b44e71b.mount: Deactivated successfully. Feb 20 04:55:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:55:47 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:47.155 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, 
additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:47 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:47.157 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated#033[00m Feb 20 04:55:47 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:47.160 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:55:47 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:47.161 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[e92f375c-9ce6-47af-bc98-b68cdb489ca1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:55:47 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:47.487 2 INFO neutron.agent.securitygroups_rpc [None req-4aebfeae-a91f-4757-b68b-fe43601c173b 
809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:47 localhost podman[241968]: time="2026-02-20T09:55:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:55:47 localhost podman[241968]: @ - - [20/Feb/2026:09:55:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157076 "" "Go-http-client/1.1" Feb 20 04:55:47 localhost podman[241968]: @ - - [20/Feb/2026:09:55:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18807 "" "Go-http-client/1.1" Feb 20 04:55:47 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 e152: 6 total, 6 up, 6 in Feb 20 04:55:48 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:48.616 2 INFO neutron.agent.securitygroups_rpc [None req-f40f35c0-4148-4e05-a7c2-3455638d7684 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:48 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:48.992 2 INFO neutron.agent.securitygroups_rpc [None req-b8e91bcb-8145-4128-b349-c2aa5e79d87e 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:55:49 localhost nova_compute[281288]: 2026-02-20 09:55:49.497 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:49.515 264355 INFO neutron.agent.linux.ip_lib [None req-5508e5c9-f288-401d-b9b9-5d3584cd4c63 - - - - - -] Device tap36eed41e-8c cannot be used as it has no MAC address#033[00m Feb 20 04:55:49 localhost 
nova_compute[281288]: 2026-02-20 09:55:49.530 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:49 localhost kernel: device tap36eed41e-8c entered promiscuous mode Feb 20 04:55:49 localhost nova_compute[281288]: 2026-02-20 09:55:49.536 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:49 localhost NetworkManager[5988]: [1771581349.5394] manager: (tap36eed41e-8c): new Generic device (/org/freedesktop/NetworkManager/Devices/44) Feb 20 04:55:49 localhost ovn_controller[156798]: 2026-02-20T09:55:49Z|00269|binding|INFO|Claiming lport 36eed41e-8c87-4e39-8638-1f088c4d480e for this chassis. Feb 20 04:55:49 localhost ovn_controller[156798]: 2026-02-20T09:55:49Z|00270|binding|INFO|36eed41e-8c87-4e39-8638-1f088c4d480e: Claiming unknown Feb 20 04:55:49 localhost systemd-udevd[315691]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:55:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:49.547 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-7d64da0e-050b-4b53-8861-874f3c3ef083', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d64da0e-050b-4b53-8861-874f3c3ef083', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24080264-67e8-4d16-abbb-0767714bc8ff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=36eed41e-8c87-4e39-8638-1f088c4d480e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:55:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:49.549 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 36eed41e-8c87-4e39-8638-1f088c4d480e in datapath 7d64da0e-050b-4b53-8861-874f3c3ef083 bound to our chassis#033[00m Feb 20 04:55:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:49.551 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port c9940588-ae7f-4658-b111-18cf96086819 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 04:55:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:49.551 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d64da0e-050b-4b53-8861-874f3c3ef083, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 04:55:49 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:49.552 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7f2b89a3-6560-4fe0-a8cb-9adb7fa4c5f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 04:55:49 localhost journal[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 04:55:49 localhost journal[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 04:55:49 localhost ovn_controller[156798]: 2026-02-20T09:55:49Z|00271|binding|INFO|Setting lport 36eed41e-8c87-4e39-8638-1f088c4d480e ovn-installed in OVS
Feb 20 04:55:49 localhost ovn_controller[156798]: 2026-02-20T09:55:49Z|00272|binding|INFO|Setting lport 36eed41e-8c87-4e39-8638-1f088c4d480e up in Southbound
Feb 20 04:55:49 localhost nova_compute[281288]: 2026-02-20 09:55:49.570 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:49 localhost journal[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 04:55:49 localhost journal[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 04:55:49 localhost journal[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 04:55:49 localhost journal[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 04:55:49 localhost journal[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 04:55:49 localhost journal[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 04:55:49 localhost nova_compute[281288]: 2026-02-20 09:55:49.599 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:49 localhost nova_compute[281288]: 2026-02-20 09:55:49.625 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:50 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:50.051 2 INFO neutron.agent.securitygroups_rpc [None req-005508de-f69c-4da2-9936-4351a4d76fde 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 04:55:50 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:50.175 2 INFO neutron.agent.securitygroups_rpc [None req-8c2a2fb8-3a21-4d90-b744-6c75dba74fae f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 04:55:50 localhost podman[315762]:
Feb 20 04:55:50 localhost podman[315762]: 2026-02-20 09:55:50.468980469 +0000 UTC m=+0.092251720 container create 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 04:55:50 localhost systemd[1]: Started libpod-conmon-6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8.scope.
Feb 20 04:55:50 localhost podman[315762]: 2026-02-20 09:55:50.422022364 +0000 UTC m=+0.045293645 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 04:55:50 localhost systemd[1]: Started libcrun container.
Feb 20 04:55:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e2bd54cd7424e1b7846b66bd6e6847e221583891328681be5cb3316924e217c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 04:55:50 localhost podman[315762]: 2026-02-20 09:55:50.553218734 +0000 UTC m=+0.176490045 container init 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 20 04:55:50 localhost podman[315762]: 2026-02-20 09:55:50.564505176 +0000 UTC m=+0.187776437 container start 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 04:55:50 localhost dnsmasq[315780]: started, version 2.85 cachesize 150
Feb 20 04:55:50 localhost dnsmasq[315780]: DNS service limited to local subnets
Feb 20 04:55:50 localhost dnsmasq[315780]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 04:55:50 localhost dnsmasq[315780]: warning: no upstream servers configured
Feb 20 04:55:50 localhost dnsmasq-dhcp[315780]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 04:55:50 localhost dnsmasq[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/addn_hosts - 0 addresses
Feb 20 04:55:50 localhost dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/host
Feb 20 04:55:50 localhost dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/opts
Feb 20 04:55:50 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:50.623 264355 INFO neutron.agent.dhcp.agent [None req-20df167f-7f80-49d5-8aed-c087e176ca79 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:49Z, description=, device_id=40707009-5dc5-44c2-8d25-acba20c2e4ac, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=3ba2d20e-026f-45bd-b5aa-4b36de93e613, ip_allocation=immediate, mac_address=fa:16:3e:9a:3d:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:46Z, description=, dns_domain=, id=7d64da0e-050b-4b53-8861-874f3c3ef083, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-38954291, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37042, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2154, status=ACTIVE, subnets=['0c6212d7-4953-4ba5-8041-3a0436e8149b'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:47Z, vlan_transparent=None, network_id=7d64da0e-050b-4b53-8861-874f3c3ef083, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2176, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:49Z on network 7d64da0e-050b-4b53-8861-874f3c3ef083
Feb 20 04:55:50 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:50.782 264355 INFO neutron.agent.dhcp.agent [None req-4295fb46-a102-499c-826c-250329e900f8 - - - - - -] DHCP configuration for ports {'b804fc28-6e7e-40dc-ac07-67ec6e857bfe'} is completed
Feb 20 04:55:50 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:50.912 2 INFO neutron.agent.securitygroups_rpc [None req-8cf476ac-f2bb-4715-a401-741101924898 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 04:55:50 localhost dnsmasq[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/addn_hosts - 1 addresses
Feb 20 04:55:50 localhost podman[315798]: 2026-02-20 09:55:50.948969751 +0000 UTC m=+0.060597120 container kill 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 04:55:50 localhost dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/host
Feb 20 04:55:50 localhost dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/opts
Feb 20 04:55:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:51.235 264355 INFO neutron.agent.dhcp.agent [None req-d41ba51e-a2b4-40d5-8fd2-3dd95a0ca548 - - - - - -] DHCP configuration for ports {'3ba2d20e-026f-45bd-b5aa-4b36de93e613'} is completed
Feb 20 04:55:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:55:52 localhost nova_compute[281288]: 2026-02-20 09:55:52.136 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:52 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:52.162 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:49Z, description=, device_id=40707009-5dc5-44c2-8d25-acba20c2e4ac, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=3ba2d20e-026f-45bd-b5aa-4b36de93e613, ip_allocation=immediate, mac_address=fa:16:3e:9a:3d:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:46Z, description=, dns_domain=, id=7d64da0e-050b-4b53-8861-874f3c3ef083, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-38954291, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37042, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2154, status=ACTIVE, subnets=['0c6212d7-4953-4ba5-8041-3a0436e8149b'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:47Z, vlan_transparent=None, network_id=7d64da0e-050b-4b53-8861-874f3c3ef083, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2176, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:49Z on network 7d64da0e-050b-4b53-8861-874f3c3ef083
Feb 20 04:55:52 localhost dnsmasq[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/addn_hosts - 1 addresses
Feb 20 04:55:52 localhost dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/host
Feb 20 04:55:52 localhost dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/opts
Feb 20 04:55:52 localhost podman[315837]: 2026-02-20 09:55:52.370754357 +0000 UTC m=+0.056160504 container kill 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 04:55:52 localhost nova_compute[281288]: 2026-02-20 09:55:52.451 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:52 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:52.575 264355 INFO neutron.agent.dhcp.agent [None req-a0869f93-dc3b-4277-9bd9-4d40b4467b17 - - - - - -] DHCP configuration for ports {'3ba2d20e-026f-45bd-b5aa-4b36de93e613'} is completed
Feb 20 04:55:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:52.617 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 04:55:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:52.619 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 04:55:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:52.622 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 04:55:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:52.623 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc348c9-3d8e-4abd-94cb-6509f2c6e23f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 04:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 04:55:53 localhost podman[315859]: 2026-02-20 09:55:53.143138959 +0000 UTC m=+0.084152614 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 04:55:53 localhost podman[315859]: 2026-02-20 09:55:53.15997719 +0000 UTC m=+0.100990855 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 20 04:55:53 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 04:55:54 localhost nova_compute[281288]: 2026-02-20 09:55:54.526 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:54 localhost nova_compute[281288]: 2026-02-20 09:55:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 04:55:55 localhost nova_compute[281288]: 2026-02-20 09:55:55.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 04:55:55 localhost nova_compute[281288]: 2026-02-20 09:55:55.770 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 04:55:55 localhost nova_compute[281288]: 2026-02-20 09:55:55.771 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 04:55:55 localhost nova_compute[281288]: 2026-02-20 09:55:55.771 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 04:55:55 localhost nova_compute[281288]: 2026-02-20 09:55:55.772 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 04:55:55 localhost nova_compute[281288]: 2026-02-20 09:55:55.772 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 04:55:56 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:56.004 2 INFO neutron.agent.securitygroups_rpc [None req-29e4bc85-2be1-46ee-a8e4-a169ea695f47 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.026 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.028 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.033 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.034 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[80525b26-302b-4de2-8a53-15c0da1d6682]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 04:55:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 04:55:56 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1770512086' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.260 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 04:55:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.342 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.343 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 04:55:56 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:56.459 264355 INFO neutron.agent.linux.ip_lib [None req-f02ad7d0-6a8a-4737-b433-d3516658bae2 - - - - - -] Device tap7fb0a486-a6 cannot be used as it has no MAC address
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.481 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:56 localhost kernel: device tap7fb0a486-a6 entered promiscuous mode
Feb 20 04:55:56 localhost NetworkManager[5988]: [1771581356.4864] manager: (tap7fb0a486-a6): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.489 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:56 localhost ovn_controller[156798]: 2026-02-20T09:55:56Z|00273|binding|INFO|Claiming lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 for this chassis.
Feb 20 04:55:56 localhost ovn_controller[156798]: 2026-02-20T09:55:56Z|00274|binding|INFO|7fb0a486-a66c-4d37-81fc-183b9c067f22: Claiming unknown
Feb 20 04:55:56 localhost systemd-udevd[315914]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.500 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-477c60d5-1ced-40c8-b389-807eea4d8a62', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-477c60d5-1ced-40c8-b389-807eea4d8a62', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a09b3407c3143c8ae0948ccb18c1e61', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9d9638a-132c-434c-893e-fbb0a0a85486, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7fb0a486-a66c-4d37-81fc-183b9c067f22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.502 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 7fb0a486-a66c-4d37-81fc-183b9c067f22 in datapath 477c60d5-1ced-40c8-b389-807eea4d8a62 bound to our chassis
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.504 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 477c60d5-1ced-40c8-b389-807eea4d8a62 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.505 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[feff1754-0592-4af4-885c-9bfc920581fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 04:55:56 localhost journal[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 04:55:56 localhost ovn_controller[156798]: 2026-02-20T09:55:56Z|00275|binding|INFO|Setting lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 ovn-installed in OVS
Feb 20 04:55:56 localhost ovn_controller[156798]: 2026-02-20T09:55:56Z|00276|binding|INFO|Setting lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 up in Southbound
Feb 20 04:55:56 localhost journal[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.522 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:56 localhost journal[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 04:55:56 localhost journal[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 04:55:56 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:56.532 2 INFO neutron.agent.securitygroups_rpc [None req-35ffc450-9844-472b-bd23-e1de49029696 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 04:55:56 localhost journal[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 04:55:56 localhost journal[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 04:55:56 localhost journal[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 04:55:56 localhost journal[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.550 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.551 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11323MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.551 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.552 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.554 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:56 localhost openstack_network_exporter[244414]: ERROR 09:55:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 04:55:56 localhost openstack_network_exporter[244414]:
Feb 20 04:55:56 localhost openstack_network_exporter[244414]: ERROR 09:55:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 04:55:56 localhost openstack_network_exporter[244414]:
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.586 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.727 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.744 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 04:55:56 localhost nova_compute[281288]: 2026-02-20 09:55:56.745 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.746 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 04:55:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:55:56.748 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 04:55:56 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:56.986 2 INFO neutron.agent.securitygroups_rpc [None req-add9b10e-24c8-47e6-9727-38256205ffd5 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 04:55:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 04:55:57 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:55:57 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/438322960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:55:57 localhost systemd[1]: tmp-crun.M2yvpP.mount: Deactivated successfully. Feb 20 04:55:57 localhost nova_compute[281288]: 2026-02-20 09:55:57.165 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:55:57 localhost podman[315983]: 2026-02-20 09:55:57.164991219 +0000 UTC m=+0.102188512 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:55:57 localhost nova_compute[281288]: 2026-02-20 09:55:57.172 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:55:57 localhost nova_compute[281288]: 2026-02-20 09:55:57.193 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:55:57 localhost nova_compute[281288]: 2026-02-20 09:55:57.194 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:55:57 localhost nova_compute[281288]: 2026-02-20 09:55:57.195 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - 
- - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:55:57 localhost podman[315983]: 2026-02-20 09:55:57.201988301 +0000 UTC m=+0.139185604 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:55:57 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:55:57 localhost podman[316030]: Feb 20 04:55:57 localhost podman[316030]: 2026-02-20 09:55:57.319246739 +0000 UTC m=+0.069723837 container create 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:55:57 localhost systemd[1]: Started libpod-conmon-59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01.scope. Feb 20 04:55:57 localhost podman[316030]: 2026-02-20 09:55:57.278209973 +0000 UTC m=+0.028687111 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:55:57 localhost systemd[1]: Started libcrun container. 
Feb 20 04:55:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f8187cf9303c1f059ae001862a85201eb317f04d51767ad21701d4f209390a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:55:57 localhost podman[316030]: 2026-02-20 09:55:57.394664967 +0000 UTC m=+0.145142065 container init 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:55:57 localhost podman[316030]: 2026-02-20 09:55:57.403757583 +0000 UTC m=+0.154234681 container start 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:55:57 localhost dnsmasq[316049]: started, version 2.85 cachesize 150 Feb 20 04:55:57 localhost dnsmasq[316049]: DNS service limited to local subnets Feb 20 04:55:57 localhost dnsmasq[316049]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:55:57 localhost dnsmasq[316049]: warning: no upstream servers 
configured Feb 20 04:55:57 localhost dnsmasq-dhcp[316049]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:55:57 localhost dnsmasq[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/addn_hosts - 0 addresses Feb 20 04:55:57 localhost dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/host Feb 20 04:55:57 localhost dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/opts Feb 20 04:55:57 localhost nova_compute[281288]: 2026-02-20 09:55:57.425 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:57.467 264355 INFO neutron.agent.dhcp.agent [None req-f02ad7d0-6a8a-4737-b433-d3516658bae2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:56Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=24ebb8b1-2361-48b3-b58d-97f9d2e4f4eb, ip_allocation=immediate, mac_address=fa:16:3e:2d:41:23, name=tempest-PortsIpV6TestJSON-708453411, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:52Z, description=, dns_domain=, id=477c60d5-1ced-40c8-b389-807eea4d8a62, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1206391875, port_security_enabled=True, project_id=3a09b3407c3143c8ae0948ccb18c1e61, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27615, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2181, status=ACTIVE, subnets=['005984a6-885d-4a90-94c5-fdf65af6044a'], tags=[], 
tenant_id=3a09b3407c3143c8ae0948ccb18c1e61, updated_at=2026-02-20T09:55:54Z, vlan_transparent=None, network_id=477c60d5-1ced-40c8-b389-807eea4d8a62, port_security_enabled=True, project_id=3a09b3407c3143c8ae0948ccb18c1e61, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['b2e5856c-f1df-4bbc-8f9c-41698aa249c6'], standard_attr_id=2213, status=DOWN, tags=[], tenant_id=3a09b3407c3143c8ae0948ccb18c1e61, updated_at=2026-02-20T09:55:56Z on network 477c60d5-1ced-40c8-b389-807eea4d8a62#033[00m Feb 20 04:55:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:57.542 264355 INFO neutron.agent.dhcp.agent [None req-f2ffb705-493e-40d9-a18a-d200026d8a4c - - - - - -] DHCP configuration for ports {'0e3d4067-022e-417e-90d3-44d8efd32e90'} is completed#033[00m Feb 20 04:55:57 localhost dnsmasq[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/addn_hosts - 1 addresses Feb 20 04:55:57 localhost dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/host Feb 20 04:55:57 localhost dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/opts Feb 20 04:55:57 localhost podman[316066]: 2026-02-20 09:55:57.65583663 +0000 UTC m=+0.056983009 container kill 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:55:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:55:57.928 264355 INFO neutron.agent.dhcp.agent [None 
req-44f0482b-cbae-46f2-8704-a715503832e3 - - - - - -] DHCP configuration for ports {'24ebb8b1-2361-48b3-b58d-97f9d2e4f4eb'} is completed#033[00m Feb 20 04:55:58 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:58.482 2 INFO neutron.agent.securitygroups_rpc [None req-a5703a13-6375-4e7f-aba2-f531a9b12f0a f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:55:58 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:58.863 2 INFO neutron.agent.securitygroups_rpc [None req-ca73f77f-8256-4f4e-b317-bc2e72fd527f 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:55:59 localhost nova_compute[281288]: 2026-02-20 09:55:59.177 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:59 localhost nova_compute[281288]: 2026-02-20 09:55:59.177 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:59 localhost nova_compute[281288]: 2026-02-20 09:55:59.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:55:59 localhost neutron_sriov_agent[257177]: 2026-02-20 09:55:59.670 2 INFO neutron.agent.securitygroups_rpc [None req-ca73f77f-8256-4f4e-b317-bc2e72fd527f 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated 
['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:55:59 localhost nova_compute[281288]: 2026-02-20 09:55:59.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:59 localhost nova_compute[281288]: 2026-02-20 09:55:59.719 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:59 localhost nova_compute[281288]: 2026-02-20 09:55:59.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:59 localhost nova_compute[281288]: 2026-02-20 09:55:59.741 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:55:59 localhost nova_compute[281288]: 2026-02-20 09:55:59.818 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:56:00 localhost podman[316086]: 2026-02-20 09:56:00.141753761 +0000 UTC m=+0.080279187 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc.) Feb 20 04:56:00 localhost podman[316086]: 2026-02-20 09:56:00.154666642 +0000 UTC m=+0.093192098 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, release=1770267347, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7) Feb 20 04:56:00 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:56:00 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:00.493 2 INFO neutron.agent.securitygroups_rpc [None req-78688fc9-1f65-4ea8-8870-27e8d247cb32 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:00 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:00.586 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:56Z, description=, device_id=e10ccfe8-c211-4168-9be0-1240dc9757c2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=24ebb8b1-2361-48b3-b58d-97f9d2e4f4eb, ip_allocation=immediate, mac_address=fa:16:3e:2d:41:23, name=tempest-PortsIpV6TestJSON-708453411, network_id=477c60d5-1ced-40c8-b389-807eea4d8a62, port_security_enabled=True, project_id=3a09b3407c3143c8ae0948ccb18c1e61, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['b2e5856c-f1df-4bbc-8f9c-41698aa249c6'], standard_attr_id=2213, status=ACTIVE, tags=[], tenant_id=3a09b3407c3143c8ae0948ccb18c1e61, updated_at=2026-02-20T09:55:58Z on network 477c60d5-1ced-40c8-b389-807eea4d8a62#033[00m Feb 20 04:56:00 localhost nova_compute[281288]: 2026-02-20 09:56:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:56:00 localhost dnsmasq[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/addn_hosts - 1 addresses Feb 20 04:56:00 localhost dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/host Feb 20 04:56:00 localhost dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/opts Feb 20 04:56:00 localhost podman[316124]: 2026-02-20 09:56:00.786541443 +0000 UTC m=+0.072091229 container kill 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:01 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:01.172 264355 INFO neutron.agent.dhcp.agent [None req-09304e15-4b73-489c-886b-10c76f865860 - - - - - -] DHCP configuration for ports {'24ebb8b1-2361-48b3-b58d-97f9d2e4f4eb'} is completed#033[00m Feb 20 04:56:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:01 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:01.576 2 INFO neutron.agent.securitygroups_rpc [None req-a4e037d0-314c-4de3-aad9-537a96cc703d 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", 
"format":"json"} v 0) Feb 20 04:56:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1014543227' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:56:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:56:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1014543227' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:56:02 localhost nova_compute[281288]: 2026-02-20 09:56:02.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:56:02 localhost nova_compute[281288]: 2026-02-20 09:56:02.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:56:03 localhost sshd[316146]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:56:03 localhost nova_compute[281288]: 2026-02-20 09:56:03.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:56:03 localhost nova_compute[281288]: 2026-02-20 09:56:03.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:56:03 localhost nova_compute[281288]: 2026-02-20 09:56:03.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:56:03 localhost nova_compute[281288]: 2026-02-20 09:56:03.823 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:56:03 localhost nova_compute[281288]: 2026-02-20 09:56:03.824 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:56:03 localhost nova_compute[281288]: 2026-02-20 09:56:03.824 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache 
for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:56:03 localhost nova_compute[281288]: 2026-02-20 09:56:03.825 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:56:03 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:03.894 2 INFO neutron.agent.securitygroups_rpc [None req-7d0bf5e0-9e1d-414c-8190-249e450828ca 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:56:04 localhost dnsmasq[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/addn_hosts - 0 addresses Feb 20 04:56:04 localhost dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/host Feb 20 04:56:04 localhost dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/opts Feb 20 04:56:04 localhost podman[316165]: 2026-02-20 09:56:04.15538023 +0000 UTC m=+0.080017158 container kill 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:56:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:04.193 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), 
table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:04.195 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata 
Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated#033[00m Feb 20 04:56:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:04.199 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:04.200 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5d56de93-b2bf-4df5-a223-4db45aaeeec4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:04 localhost kernel: device tap7fb0a486-a6 left promiscuous mode Feb 20 04:56:04 localhost ovn_controller[156798]: 2026-02-20T09:56:04Z|00277|binding|INFO|Releasing lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 from this chassis (sb_readonly=0) Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.373 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:04 localhost ovn_controller[156798]: 2026-02-20T09:56:04Z|00278|binding|INFO|Setting lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 down in Southbound Feb 20 04:56:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:04.384 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-477c60d5-1ced-40c8-b389-807eea4d8a62', 
'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-477c60d5-1ced-40c8-b389-807eea4d8a62', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a09b3407c3143c8ae0948ccb18c1e61', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9d9638a-132c-434c-893e-fbb0a0a85486, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7fb0a486-a66c-4d37-81fc-183b9c067f22) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:04.386 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 7fb0a486-a66c-4d37-81fc-183b9c067f22 in datapath 477c60d5-1ced-40c8-b389-807eea4d8a62 unbound from our chassis#033[00m Feb 20 04:56:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:04.388 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 477c60d5-1ced-40c8-b389-807eea4d8a62 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:56:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:04.390 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[26979788-d58f-4efa-bfd3-98dde28de985]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.399 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.574 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.577 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.647 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.663 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock 
"refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.664 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.665 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.665 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 20 04:56:04 localhost nova_compute[281288]: 2026-02-20 09:56:04.682 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 20 04:56:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:06.019 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:56:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:06.019 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:56:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:06.020 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:56:06 localhost podman[316205]: 2026-02-20 09:56:06.142029532 +0000 UTC m=+0.058415033 container kill 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:56:06 localhost systemd[1]: tmp-crun.Oprtiq.mount: Deactivated successfully. Feb 20 04:56:06 localhost dnsmasq[316049]: exiting on receipt of SIGTERM Feb 20 04:56:06 localhost systemd[1]: libpod-59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01.scope: Deactivated successfully. 
Feb 20 04:56:06 localhost podman[316217]: 2026-02-20 09:56:06.200824156 +0000 UTC m=+0.044619285 container died 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 20 04:56:06 localhost podman[316217]: 2026-02-20 09:56:06.251910966 +0000 UTC m=+0.095706075 container cleanup 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:06 localhost nova_compute[281288]: 2026-02-20 09:56:06.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:06 localhost systemd[1]: libpod-conmon-59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01.scope: Deactivated successfully. 
Feb 20 04:56:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:06 localhost podman[316219]: 2026-02-20 09:56:06.305478581 +0000 UTC m=+0.140599926 container remove 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:56:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:56:06 localhost podman[316246]: 2026-02-20 09:56:06.406595129 +0000 UTC m=+0.086306730 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:56:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:56:06 localhost podman[316246]: 2026-02-20 09:56:06.474220721 +0000 UTC m=+0.153932302 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:56:06 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:56:06 localhost podman[316274]: 2026-02-20 09:56:06.557077325 +0000 UTC m=+0.085040562 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 20 04:56:06 localhost podman[316274]: 2026-02-20 09:56:06.561480918 +0000 UTC m=+0.089444005 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 20 04:56:06 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:56:06 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:06.578 264355 INFO neutron.agent.dhcp.agent [None req-14e7e21d-85db-4fcd-b44f-93fe00750199 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:06 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:06.578 264355 INFO neutron.agent.dhcp.agent [None req-14e7e21d-85db-4fcd-b44f-93fe00750199 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:07.114 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:07 localhost systemd[1]: tmp-crun.k88TM2.mount: Deactivated successfully. Feb 20 04:56:07 localhost systemd[1]: var-lib-containers-storage-overlay-2f8187cf9303c1f059ae001862a85201eb317f04d51767ad21701d4f209390a5-merged.mount: Deactivated successfully. Feb 20 04:56:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01-userdata-shm.mount: Deactivated successfully. Feb 20 04:56:07 localhost systemd[1]: run-netns-qdhcp\x2d477c60d5\x2d1ced\x2d40c8\x2db389\x2d807eea4d8a62.mount: Deactivated successfully. 
Feb 20 04:56:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:56:07 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/766486190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:56:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:56:07 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/766486190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:56:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:07.576 264355 INFO neutron.agent.linux.ip_lib [None req-18a669aa-7587-498f-825f-6b145f8a4193 - - - - - -] Device tapab7745cf-b0 cannot be used as it has no MAC address#033[00m Feb 20 04:56:07 localhost ovn_controller[156798]: 2026-02-20T09:56:07Z|00279|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:56:07 localhost nova_compute[281288]: 2026-02-20 09:56:07.639 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:07 localhost kernel: device tapab7745cf-b0 entered promiscuous mode Feb 20 04:56:07 localhost ovn_controller[156798]: 2026-02-20T09:56:07Z|00280|binding|INFO|Claiming lport ab7745cf-b091-4696-a4c2-1807d5c5fc66 for this chassis. 
Feb 20 04:56:07 localhost ovn_controller[156798]: 2026-02-20T09:56:07Z|00281|binding|INFO|ab7745cf-b091-4696-a4c2-1807d5c5fc66: Claiming unknown Feb 20 04:56:07 localhost NetworkManager[5988]: [1771581367.6578] manager: (tapab7745cf-b0): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Feb 20 04:56:07 localhost nova_compute[281288]: 2026-02-20 09:56:07.657 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:07 localhost systemd-udevd[316302]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:56:07 localhost nova_compute[281288]: 2026-02-20 09:56:07.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:07.671 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-a882f5ec-b867-48ca-837a-0ff3e12032b5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a882f5ec-b867-48ca-837a-0ff3e12032b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=b30598c7-87a8-481e-9e41-a7365e7f8781, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ab7745cf-b091-4696-a4c2-1807d5c5fc66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:07.673 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ab7745cf-b091-4696-a4c2-1807d5c5fc66 in datapath a882f5ec-b867-48ca-837a-0ff3e12032b5 bound to our chassis#033[00m Feb 20 04:56:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:07.676 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 66774e6a-d590-4cbf-92dc-54d8452fe968 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:56:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:07.676 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a882f5ec-b867-48ca-837a-0ff3e12032b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:07.677 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[437fc9ac-194d-4337-9a93-3a4d5c38b098]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:07 localhost journal[229984]: ethtool ioctl error on tapab7745cf-b0: No such device Feb 20 04:56:07 localhost journal[229984]: ethtool ioctl error on tapab7745cf-b0: No such device Feb 20 04:56:07 localhost nova_compute[281288]: 2026-02-20 09:56:07.699 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:07 localhost ovn_controller[156798]: 2026-02-20T09:56:07Z|00282|binding|INFO|Setting lport 
ab7745cf-b091-4696-a4c2-1807d5c5fc66 ovn-installed in OVS Feb 20 04:56:07 localhost ovn_controller[156798]: 2026-02-20T09:56:07Z|00283|binding|INFO|Setting lport ab7745cf-b091-4696-a4c2-1807d5c5fc66 up in Southbound Feb 20 04:56:07 localhost nova_compute[281288]: 2026-02-20 09:56:07.702 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:07 localhost journal[229984]: ethtool ioctl error on tapab7745cf-b0: No such device Feb 20 04:56:07 localhost journal[229984]: ethtool ioctl error on tapab7745cf-b0: No such device Feb 20 04:56:07 localhost journal[229984]: ethtool ioctl error on tapab7745cf-b0: No such device Feb 20 04:56:07 localhost journal[229984]: ethtool ioctl error on tapab7745cf-b0: No such device Feb 20 04:56:07 localhost journal[229984]: ethtool ioctl error on tapab7745cf-b0: No such device Feb 20 04:56:07 localhost journal[229984]: ethtool ioctl error on tapab7745cf-b0: No such device Feb 20 04:56:07 localhost nova_compute[281288]: 2026-02-20 09:56:07.746 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:07 localhost nova_compute[281288]: 2026-02-20 09:56:07.775 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:07 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:07.888 2 INFO neutron.agent.securitygroups_rpc [None req-8965acb2-2b16-4d89-a227-154eee5fe38f 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:08.373 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, 
old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.3 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:08.376 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 
cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated#033[00m Feb 20 04:56:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:08.380 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:08 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:08.381 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[3002a9e2-03c4-447f-b62c-7d0e1ae9cc6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:08 localhost podman[316373]: Feb 20 04:56:08 localhost podman[316373]: 2026-02-20 09:56:08.669111011 +0000 UTC m=+0.077698518 container create 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 20 04:56:08 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:08.679 2 INFO neutron.agent.securitygroups_rpc [None req-d51c801f-c66b-4697-8723-78081587d201 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:08 localhost systemd[1]: Started libpod-conmon-3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2.scope. 
Feb 20 04:56:08 localhost podman[316373]: 2026-02-20 09:56:08.632271403 +0000 UTC m=+0.040858980 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:56:08 localhost systemd[1]: tmp-crun.g53Swh.mount: Deactivated successfully. Feb 20 04:56:08 localhost systemd[1]: Started libcrun container. Feb 20 04:56:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72d8d3ddae4f030671431c2b14aac6be84d7b25fbbaca5b27db499d762216d2d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:56:08 localhost podman[316373]: 2026-02-20 09:56:08.763081052 +0000 UTC m=+0.171668589 container init 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 20 04:56:08 localhost podman[316373]: 2026-02-20 09:56:08.774116957 +0000 UTC m=+0.182704494 container start 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:08 localhost dnsmasq[316391]: started, version 2.85 cachesize 150 Feb 20 04:56:08 
localhost dnsmasq[316391]: DNS service limited to local subnets Feb 20 04:56:08 localhost dnsmasq[316391]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:56:08 localhost dnsmasq[316391]: warning: no upstream servers configured Feb 20 04:56:08 localhost dnsmasq-dhcp[316391]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:56:08 localhost dnsmasq[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/addn_hosts - 0 addresses Feb 20 04:56:08 localhost dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/host Feb 20 04:56:08 localhost dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/opts Feb 20 04:56:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:08.832 264355 INFO neutron.agent.dhcp.agent [None req-954de5ce-1da6-4437-9c3c-d61063b0cd37 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:07Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=672227c5-4720-4d41-a0d6-6e9f5b528e92, ip_allocation=immediate, mac_address=fa:16:3e:0c:3e:77, name=tempest-PortsTestJSON-495831353, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:03Z, description=, dns_domain=, id=a882f5ec-b867-48ca-837a-0ff3e12032b5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-643978840, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57544, qos_policy_id=None, revision_number=2, router:external=False, 
shared=False, standard_attr_id=2234, status=ACTIVE, subnets=['58e95fae-bb88-4e7c-92ac-e1d173306dd0'], tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:06Z, vlan_transparent=None, network_id=a882f5ec-b867-48ca-837a-0ff3e12032b5, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['72ed92b6-af24-4274-854b-a52220405faf'], standard_attr_id=2243, status=DOWN, tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:07Z on network a882f5ec-b867-48ca-837a-0ff3e12032b5#033[00m Feb 20 04:56:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:08.984 264355 INFO neutron.agent.dhcp.agent [None req-9f19bcf9-e212-43ab-bd66-0cf57b476a5f - - - - - -] DHCP configuration for ports {'7c76b554-ab71-45df-b525-0c267fc92bd2'} is completed#033[00m Feb 20 04:56:09 localhost dnsmasq[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/addn_hosts - 1 addresses Feb 20 04:56:09 localhost dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/host Feb 20 04:56:09 localhost podman[316407]: 2026-02-20 09:56:09.04516752 +0000 UTC m=+0.047728059 container kill 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:56:09 localhost dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/opts Feb 20 04:56:09 localhost 
neutron_dhcp_agent[264351]: 2026-02-20 09:56:09.298 264355 INFO neutron.agent.dhcp.agent [None req-31b7dcf9-ba3d-48ac-871d-490caaa16b0f - - - - - -] DHCP configuration for ports {'672227c5-4720-4d41-a0d6-6e9f5b528e92'} is completed#033[00m Feb 20 04:56:09 localhost dnsmasq[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/addn_hosts - 0 addresses Feb 20 04:56:09 localhost dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/host Feb 20 04:56:09 localhost podman[316445]: 2026-02-20 09:56:09.389473976 +0000 UTC m=+0.050081561 container kill 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:56:09 localhost dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/opts Feb 20 04:56:09 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:09.493 2 INFO neutron.agent.securitygroups_rpc [None req-166aeacb-5366-40db-a13d-35c7cc5a7a14 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:09 localhost nova_compute[281288]: 2026-02-20 09:56:09.576 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:09.802 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 
66774e6a-d590-4cbf-92dc-54d8452fe968 with type ""#033[00m Feb 20 04:56:09 localhost ovn_controller[156798]: 2026-02-20T09:56:09Z|00284|binding|INFO|Removing iface tapab7745cf-b0 ovn-installed in OVS Feb 20 04:56:09 localhost ovn_controller[156798]: 2026-02-20T09:56:09Z|00285|binding|INFO|Removing lport ab7745cf-b091-4696-a4c2-1807d5c5fc66 ovn-installed in OVS Feb 20 04:56:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:09.803 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-a882f5ec-b867-48ca-837a-0ff3e12032b5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a882f5ec-b867-48ca-837a-0ff3e12032b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b30598c7-87a8-481e-9e41-a7365e7f8781, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ab7745cf-b091-4696-a4c2-1807d5c5fc66) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:09 localhost nova_compute[281288]: 2026-02-20 09:56:09.803 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:09 
localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:09.806 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ab7745cf-b091-4696-a4c2-1807d5c5fc66 in datapath a882f5ec-b867-48ca-837a-0ff3e12032b5 unbound from our chassis#033[00m Feb 20 04:56:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:09.810 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a882f5ec-b867-48ca-837a-0ff3e12032b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:09 localhost nova_compute[281288]: 2026-02-20 09:56:09.811 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:09 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:09.811 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1d75db-2c04-4db3-b628-cec5d28f49ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:09 localhost dnsmasq[316391]: exiting on receipt of SIGTERM Feb 20 04:56:09 localhost podman[316484]: 2026-02-20 09:56:09.820440931 +0000 UTC m=+0.064051264 container kill 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:56:09 localhost systemd[1]: libpod-3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2.scope: Deactivated successfully. 
Feb 20 04:56:09 localhost podman[316498]: 2026-02-20 09:56:09.894413685 +0000 UTC m=+0.057248168 container died 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:56:09 localhost podman[316498]: 2026-02-20 09:56:09.930660625 +0000 UTC m=+0.093495068 container cleanup 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:56:09 localhost systemd[1]: libpod-conmon-3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2.scope: Deactivated successfully. 
Feb 20 04:56:09 localhost podman[316499]: 2026-02-20 09:56:09.972381741 +0000 UTC m=+0.130222462 container remove 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:56:09 localhost kernel: device tapab7745cf-b0 left promiscuous mode Feb 20 04:56:09 localhost nova_compute[281288]: 2026-02-20 09:56:09.985 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:09 localhost nova_compute[281288]: 2026-02-20 09:56:09.998 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:10 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:10.145 264355 INFO neutron.agent.dhcp.agent [None req-63c1a141-18cb-45d0-a643-47cade11de72 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:10 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:10.151 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:10 localhost ovn_controller[156798]: 2026-02-20T09:56:10Z|00286|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:56:10 localhost nova_compute[281288]: 2026-02-20 09:56:10.446 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 
04:56:10 localhost systemd[1]: var-lib-containers-storage-overlay-72d8d3ddae4f030671431c2b14aac6be84d7b25fbbaca5b27db499d762216d2d-merged.mount: Deactivated successfully. Feb 20 04:56:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2-userdata-shm.mount: Deactivated successfully. Feb 20 04:56:10 localhost systemd[1]: run-netns-qdhcp\x2da882f5ec\x2db867\x2d48ca\x2d837a\x2d0ff3e12032b5.mount: Deactivated successfully. Feb 20 04:56:10 localhost sshd[316526]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:56:11 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:11.006 2 INFO neutron.agent.securitygroups_rpc [None req-7e54c9a7-f5f5-46c1-ae1b-688f8acab697 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['46f15231-c0dd-46d4-9abc-adba5985e75b']#033[00m Feb 20 04:56:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:11 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:11.322 2 INFO neutron.agent.securitygroups_rpc [None req-4b1a20ca-0949-416f-91ae-525739a1e77a f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:13 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:13.737 2 INFO neutron.agent.securitygroups_rpc [None req-93b1773d-c2eb-4652-8e8d-0c460cd5364e 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['46f15231-c0dd-46d4-9abc-adba5985e75b', '446482cb-8c18-450e-acf7-2fbe583929b8']#033[00m Feb 20 04:56:13 localhost nova_compute[281288]: 2026-02-20 09:56:13.986 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:56:14 localhost podman[316528]: 2026-02-20 09:56:14.148098877 +0000 UTC m=+0.086117493 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:14 localhost podman[316528]: 2026-02-20 09:56:14.162107792 +0000 UTC m=+0.100126378 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3) Feb 20 04:56:14 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:56:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:14.358 2 INFO neutron.agent.securitygroups_rpc [None req-6fbfa532-f4c6-42e9-b707-63e0a42ce0d3 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['446482cb-8c18-450e-acf7-2fbe583929b8']#033[00m Feb 20 04:56:14 localhost nova_compute[281288]: 2026-02-20 09:56:14.582 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:14 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:14.681 264355 INFO neutron.agent.linux.ip_lib [None req-a9d0852b-246f-45da-9c08-e40cbf2b895a - - - - - -] Device tap50f94ccb-90 cannot be used as it has no MAC address#033[00m Feb 20 04:56:14 localhost nova_compute[281288]: 2026-02-20 09:56:14.705 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:14 localhost kernel: device tap50f94ccb-90 entered promiscuous mode Feb 20 04:56:14 localhost NetworkManager[5988]: [1771581374.7146] manager: (tap50f94ccb-90): new Generic device (/org/freedesktop/NetworkManager/Devices/47) Feb 20 04:56:14 localhost ovn_controller[156798]: 2026-02-20T09:56:14Z|00287|binding|INFO|Claiming lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f for this chassis. 
Feb 20 04:56:14 localhost nova_compute[281288]: 2026-02-20 09:56:14.715 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:14 localhost ovn_controller[156798]: 2026-02-20T09:56:14Z|00288|binding|INFO|50f94ccb-90f6-40a9-9b2a-8774575ebe1f: Claiming unknown Feb 20 04:56:14 localhost systemd-udevd[316558]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:56:14 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:14.729 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac9b59eb-ef08-4f48-908a-f0e44b4f5714, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=50f94ccb-90f6-40a9-9b2a-8774575ebe1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:14 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:14.731 162652 INFO 
neutron.agent.ovn.metadata.agent [-] Port 50f94ccb-90f6-40a9-9b2a-8774575ebe1f in datapath 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543 bound to our chassis#033[00m Feb 20 04:56:14 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:14.734 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:56:14 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:14.735 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[6f1165e7-7bea-47e9-a471-043b8ba8051d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:14 localhost journal[229984]: ethtool ioctl error on tap50f94ccb-90: No such device Feb 20 04:56:14 localhost ovn_controller[156798]: 2026-02-20T09:56:14Z|00289|binding|INFO|Setting lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f ovn-installed in OVS Feb 20 04:56:14 localhost ovn_controller[156798]: 2026-02-20T09:56:14Z|00290|binding|INFO|Setting lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f up in Southbound Feb 20 04:56:14 localhost journal[229984]: ethtool ioctl error on tap50f94ccb-90: No such device Feb 20 04:56:14 localhost nova_compute[281288]: 2026-02-20 09:56:14.749 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:14 localhost journal[229984]: ethtool ioctl error on tap50f94ccb-90: No such device Feb 20 04:56:14 localhost journal[229984]: ethtool ioctl error on tap50f94ccb-90: No such device Feb 20 04:56:14 localhost journal[229984]: ethtool ioctl error on tap50f94ccb-90: No such device Feb 20 04:56:14 localhost journal[229984]: ethtool ioctl error on tap50f94ccb-90: No such device Feb 20 04:56:14 localhost journal[229984]: ethtool ioctl error on tap50f94ccb-90: 
No such device Feb 20 04:56:14 localhost journal[229984]: ethtool ioctl error on tap50f94ccb-90: No such device Feb 20 04:56:14 localhost nova_compute[281288]: 2026-02-20 09:56:14.784 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:14 localhost nova_compute[281288]: 2026-02-20 09:56:14.809 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:15 localhost dnsmasq[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/addn_hosts - 0 addresses Feb 20 04:56:15 localhost dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/host Feb 20 04:56:15 localhost podman[316604]: 2026-02-20 09:56:15.019620998 +0000 UTC m=+0.053907986 container kill a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:56:15 localhost dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/opts Feb 20 04:56:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:15.058 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, 
parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.3 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:15.060 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated#033[00m Feb 20 04:56:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:15.064 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF 
ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:15.065 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[82d8b059-092b-43da-986b-58155effc230]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:15 localhost sshd[316633]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:56:15 localhost kernel: device tap4a535ed2-00 left promiscuous mode Feb 20 04:56:15 localhost ovn_controller[156798]: 2026-02-20T09:56:15Z|00291|binding|INFO|Releasing lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 from this chassis (sb_readonly=0) Feb 20 04:56:15 localhost nova_compute[281288]: 2026-02-20 09:56:15.294 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:15 localhost ovn_controller[156798]: 2026-02-20T09:56:15Z|00292|binding|INFO|Setting lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 down in Southbound Feb 20 04:56:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:15.304 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-cd45d753-df49-44f6-b419-6749d7fe84f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd45d753-df49-44f6-b419-6749d7fe84f3', 'neutron:port_capabilities': '', 
'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2f3d8db-7c26-4898-8e61-0b8ba04044df, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4a535ed2-00be-4ec7-8d9e-24afdab13877) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:15.306 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 4a535ed2-00be-4ec7-8d9e-24afdab13877 in datapath cd45d753-df49-44f6-b419-6749d7fe84f3 unbound from our chassis#033[00m Feb 20 04:56:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:15.308 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cd45d753-df49-44f6-b419-6749d7fe84f3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:56:15 localhost nova_compute[281288]: 2026-02-20 09:56:15.308 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:15.310 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[299afe80-b78f-4512-819a-d21e52b6e073]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:15 localhost podman[316667]: Feb 20 04:56:15 localhost podman[316667]: 2026-02-20 09:56:15.743328365 +0000 UTC m=+0.094330873 container create 
3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:56:15 localhost systemd[1]: Started libpod-conmon-3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49.scope. Feb 20 04:56:15 localhost podman[316667]: 2026-02-20 09:56:15.699605348 +0000 UTC m=+0.050607866 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:56:15 localhost systemd[1]: Started libcrun container. Feb 20 04:56:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efdc0a52815bc810e6855830fc6f3a2cfc45ff834f8d7edab4a6f93251dafdd5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:56:15 localhost podman[316667]: 2026-02-20 09:56:15.818777924 +0000 UTC m=+0.169780432 container init 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 20 04:56:15 localhost podman[316667]: 2026-02-20 09:56:15.827853079 +0000 UTC m=+0.178855587 container start 
3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:15 localhost dnsmasq[316685]: started, version 2.85 cachesize 150 Feb 20 04:56:15 localhost dnsmasq[316685]: DNS service limited to local subnets Feb 20 04:56:15 localhost dnsmasq[316685]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:56:15 localhost dnsmasq[316685]: warning: no upstream servers configured Feb 20 04:56:15 localhost dnsmasq-dhcp[316685]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:56:15 localhost dnsmasq[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/addn_hosts - 0 addresses Feb 20 04:56:15 localhost dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/host Feb 20 04:56:15 localhost dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/opts Feb 20 04:56:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:16.251 264355 INFO neutron.agent.dhcp.agent [None req-1df8cec4-0e90-4b2f-835e-a8212a043feb - - - - - -] DHCP configuration for ports {'544ef45a-4e36-47a7-a9cf-a372fb626d48'} is completed#033[00m Feb 20 04:56:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:16 localhost dnsmasq[315482]: exiting on receipt of SIGTERM Feb 20 04:56:16 
localhost podman[316703]: 2026-02-20 09:56:16.577977658 +0000 UTC m=+0.058184397 container kill a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 20 04:56:16 localhost systemd[1]: libpod-a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d.scope: Deactivated successfully. Feb 20 04:56:16 localhost podman[316715]: 2026-02-20 09:56:16.653292693 +0000 UTC m=+0.063003393 container died a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:56:16 localhost podman[316715]: 2026-02-20 09:56:16.689469661 +0000 UTC m=+0.099180311 container cleanup a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:56:16 localhost systemd[1]: libpod-conmon-a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d.scope: Deactivated successfully. Feb 20 04:56:16 localhost podman[316717]: 2026-02-20 09:56:16.725479792 +0000 UTC m=+0.124305592 container remove a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 20 04:56:16 localhost systemd[1]: var-lib-containers-storage-overlay-8a8113fcfd1358b3db8a5596a585afb795b665e94caae01a8c63b1b24d0a203b-merged.mount: Deactivated successfully. Feb 20 04:56:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:56:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:16.864 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:16Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=34719fc8-3f38-4502-8ff2-ba5516ba7226, ip_allocation=immediate, mac_address=fa:16:3e:0a:ae:8f, name=tempest-PortsTestJSON-1975505507, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:10Z, description=, dns_domain=, id=6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1562581203, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45010, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2264, status=ACTIVE, subnets=['ca6109c7-e6d1-47cb-90b4-650cf67ce94e'], tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:13Z, vlan_transparent=None, network_id=6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2283, status=DOWN, tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:16Z on network 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543#033[00m Feb 20 04:56:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:16.940 264355 INFO neutron.agent.dhcp.agent [None req-39e942ed-42d1-4cb9-be31-c6659c14a791 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 
04:56:16 localhost systemd[1]: run-netns-qdhcp\x2dcd45d753\x2ddf49\x2d44f6\x2db419\x2d6749d7fe84f3.mount: Deactivated successfully. Feb 20 04:56:17 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:17.067 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:17 localhost dnsmasq[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/addn_hosts - 1 addresses Feb 20 04:56:17 localhost dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/host Feb 20 04:56:17 localhost dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/opts Feb 20 04:56:17 localhost podman[316762]: 2026-02-20 09:56:17.084627589 +0000 UTC m=+0.055210176 container kill 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 20 04:56:17 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:17.344 264355 INFO neutron.agent.dhcp.agent [None req-ca896c1b-0a7e-4005-af16-f5d61cdd11f5 - - - - - -] DHCP configuration for ports {'34719fc8-3f38-4502-8ff2-ba5516ba7226'} is completed#033[00m Feb 20 04:56:17 localhost podman[241968]: time="2026-02-20T09:56:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:56:17 localhost podman[241968]: @ - - [20/Feb/2026:09:56:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1" Feb 20 
04:56:17 localhost podman[241968]: @ - - [20/Feb/2026:09:56:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19280 "" "Go-http-client/1.1" Feb 20 04:56:17 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:17.986 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:18 localhost dnsmasq[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/addn_hosts - 0 addresses Feb 20 04:56:18 localhost dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/host Feb 20 04:56:18 localhost dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/opts Feb 20 04:56:18 localhost podman[316800]: 2026-02-20 09:56:18.372083279 +0000 UTC m=+0.056992250 container kill 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:56:18 localhost ovn_controller[156798]: 2026-02-20T09:56:18Z|00293|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:56:18 localhost nova_compute[281288]: 2026-02-20 09:56:18.558 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:18 localhost dnsmasq[316685]: exiting on receipt of SIGTERM Feb 20 04:56:18 localhost podman[316837]: 2026-02-20 09:56:18.800895448 +0000 UTC m=+0.063623540 container kill 
3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 20 04:56:18 localhost systemd[1]: libpod-3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49.scope: Deactivated successfully. Feb 20 04:56:18 localhost podman[316849]: 2026-02-20 09:56:18.870793599 +0000 UTC m=+0.057619548 container died 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 20 04:56:18 localhost podman[316849]: 2026-02-20 09:56:18.901166001 +0000 UTC m=+0.087991880 container cleanup 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team) Feb 20 04:56:18 localhost systemd[1]: libpod-conmon-3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49.scope: Deactivated successfully. Feb 20 04:56:18 localhost podman[316851]: 2026-02-20 09:56:18.942499655 +0000 UTC m=+0.121179317 container remove 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 20 04:56:18 localhost ovn_controller[156798]: 2026-02-20T09:56:18Z|00294|binding|INFO|Releasing lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f from this chassis (sb_readonly=0) Feb 20 04:56:18 localhost ovn_controller[156798]: 2026-02-20T09:56:18Z|00295|binding|INFO|Setting lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f down in Southbound Feb 20 04:56:18 localhost kernel: device tap50f94ccb-90 left promiscuous mode Feb 20 04:56:18 localhost nova_compute[281288]: 2026-02-20 09:56:18.994 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:19.005 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': 
'10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac9b59eb-ef08-4f48-908a-f0e44b4f5714, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=50f94ccb-90f6-40a9-9b2a-8774575ebe1f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:19.007 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 50f94ccb-90f6-40a9-9b2a-8774575ebe1f in datapath 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543 unbound from our chassis#033[00m Feb 20 04:56:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:19.011 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:19.012 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[815c0f17-2257-4968-beb9-7f2b6f683c80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:19 localhost nova_compute[281288]: 2026-02-20 09:56:19.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:19 
localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:19.154 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:19.241 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '14', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:19.242 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated#033[00m Feb 20 04:56:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:19.246 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:19.247 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab88c3d-4696-4b0c-8c28-ce22290974ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:19 localhost systemd[1]: var-lib-containers-storage-overlay-efdc0a52815bc810e6855830fc6f3a2cfc45ff834f8d7edab4a6f93251dafdd5-merged.mount: Deactivated successfully. Feb 20 04:56:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49-userdata-shm.mount: Deactivated successfully. Feb 20 04:56:19 localhost systemd[1]: run-netns-qdhcp\x2d6a1f8436\x2dee67\x2d4aeb\x2d90fc\x2dcbe6c39f2543.mount: Deactivated successfully. 
Feb 20 04:56:19 localhost nova_compute[281288]: 2026-02-20 09:56:19.582 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:19 localhost nova_compute[281288]: 2026-02-20 09:56:19.585 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:20 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:20.084 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:20.346 2 INFO neutron.agent.securitygroups_rpc [None req-27e863e6-abb7-4d79-8929-35ee419d3ab5 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:56:20 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/772936112' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:56:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:56:20 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/772936112' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:56:20 localhost ovn_controller[156798]: 2026-02-20T09:56:20Z|00296|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:56:20 localhost nova_compute[281288]: 2026-02-20 09:56:20.623 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:20.840 2 INFO neutron.agent.securitygroups_rpc [None req-5bc16860-c455-4be4-9017-f7ba050a5b1d f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:20.942 2 INFO neutron.agent.securitygroups_rpc [None req-3d50957a-c50d-404e-a697-bd588426aa5b 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['9c894fef-e625-4d2d-ad79-9f0215b19661']#033[00m Feb 20 04:56:21 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:21.128 2 INFO neutron.agent.securitygroups_rpc [None req-ab2c767f-db90-4059-9416-3c9c50626a18 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:21 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:21.818 2 INFO neutron.agent.securitygroups_rpc [None req-325d197d-f2bb-472d-a6df-be02729b4a1c 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated 
['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:22 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:22.603 2 INFO neutron.agent.securitygroups_rpc [None req-521cfad5-05c2-4b59-9313-296ec36811c0 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:23 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:23.111 2 INFO neutron.agent.securitygroups_rpc [None req-264095f7-8549-4a1d-9c14-cf140323ad0c 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:23 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:23.452 2 INFO neutron.agent.securitygroups_rpc [None req-8e5e38ae-f36c-4a7a-929b-4d665cde8908 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:56:24 localhost systemd[1]: tmp-crun.pm9isZ.mount: Deactivated successfully. 
Feb 20 04:56:24 localhost podman[316881]: 2026-02-20 09:56:24.153963345 +0000 UTC m=+0.091159966 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:56:24 localhost podman[316881]: 2026-02-20 09:56:24.161658529 +0000 UTC m=+0.098855110 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:56:24 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:56:24 localhost nova_compute[281288]: 2026-02-20 09:56:24.585 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:24 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:24.666 2 INFO neutron.agent.securitygroups_rpc [None req-7c70538f-1d84-485c-beb6-53999b2ce1d2 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['9c894fef-e625-4d2d-ad79-9f0215b19661', '6e36724b-9ab8-4bfe-9f74-069d82055697', '5fe0aa03-55bd-43ef-a38b-499c4a5e8b30']#033[00m Feb 20 04:56:25 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:25.110 2 INFO neutron.agent.securitygroups_rpc [None req-22f50294-5f51-4ab3-8b7c-31c2f02c0d3d 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:25 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:25.783 2 INFO neutron.agent.securitygroups_rpc [None req-f3d891d9-b12f-41a6-9c43-2a59a14444d4 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['6e36724b-9ab8-4bfe-9f74-069d82055697', '5fe0aa03-55bd-43ef-a38b-499c4a5e8b30']#033[00m Feb 20 04:56:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:26 localhost openstack_network_exporter[244414]: ERROR 09:56:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:56:26 localhost 
openstack_network_exporter[244414]: Feb 20 04:56:26 localhost openstack_network_exporter[244414]: ERROR 09:56:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:56:26 localhost openstack_network_exporter[244414]: Feb 20 04:56:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:26.619 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': 
'', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:26.621 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated#033[00m Feb 20 04:56:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:26.624 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:26 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:26.625 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[341212f9-c3b3-43e6-bae3-e5d39182f7bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:27 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:27.553 2 INFO neutron.agent.securitygroups_rpc [None req-a6f56626-9080-4b48-8909-d5cbdaffd977 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:27 localhost ovn_controller[156798]: 2026-02-20T09:56:27Z|00297|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:56:27 localhost nova_compute[281288]: 2026-02-20 09:56:27.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:56:28 localhost podman[316904]: 2026-02-20 09:56:28.144809333 +0000 UTC m=+0.084493375 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:56:28 localhost podman[316904]: 2026-02-20 09:56:28.156034353 +0000 UTC m=+0.095718405 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:56:28 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:56:28 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:28.580 2 INFO neutron.agent.securitygroups_rpc [None req-6a8272e4-f5a1-42d2-a801-cea63c76a8af f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:28 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:28.890 2 INFO neutron.agent.securitygroups_rpc [None req-222fdb52-3334-45ab-8f45-945b32b8d031 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']#033[00m Feb 20 04:56:29 localhost nova_compute[281288]: 2026-02-20 09:56:29.587 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:31 localhost ovn_controller[156798]: 2026-02-20T09:56:31Z|00298|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:56:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:56:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:31 localhost nova_compute[281288]: 2026-02-20 09:56:31.699 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:31.703 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:83:b0 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-34dc61c2-2cd5-48a1-a54d-350e15f73770', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34dc61c2-2cd5-48a1-a54d-350e15f73770', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b4d5592-ecf2-48cc-b3b1-c6ba46f9e5e6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dee4bf28-462f-4e5a-bb37-08fba06228d7) old=Port_Binding(mac=['fa:16:3e:ce:83:b0 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-34dc61c2-2cd5-48a1-a54d-350e15f73770', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-34dc61c2-2cd5-48a1-a54d-350e15f73770', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:31.704 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dee4bf28-462f-4e5a-bb37-08fba06228d7 in datapath 34dc61c2-2cd5-48a1-a54d-350e15f73770 updated#033[00m Feb 20 04:56:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:31.707 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 34dc61c2-2cd5-48a1-a54d-350e15f73770, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:31 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:31.707 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[71260dfc-c0dc-4d76-a89c-4aca0b779f89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:31 localhost podman[316928]: 2026-02-20 09:56:31.71633593 +0000 UTC m=+0.449854640 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 
'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=ubi9/ubi-minimal, release=1770267347, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 20 04:56:31 localhost podman[316928]: 2026-02-20 09:56:31.750084414 +0000 UTC m=+0.483603114 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Feb 20 04:56:32 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:32.273 2 INFO neutron.agent.securitygroups_rpc [None req-c16a47d9-8c3c-4273-8f35-4d2edcf8a46b f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:32 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 04:56:32 localhost dnsmasq[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/addn_hosts - 0 addresses Feb 20 04:56:32 localhost dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/host Feb 20 04:56:32 localhost dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/opts Feb 20 04:56:32 localhost podman[316965]: 2026-02-20 09:56:32.366268538 +0000 UTC m=+0.037554731 container kill 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 20 04:56:32 localhost ovn_controller[156798]: 2026-02-20T09:56:32Z|00299|binding|INFO|Releasing lport 36eed41e-8c87-4e39-8638-1f088c4d480e from this chassis (sb_readonly=0) Feb 20 04:56:32 localhost kernel: device tap36eed41e-8c left promiscuous mode Feb 20 04:56:32 localhost ovn_controller[156798]: 2026-02-20T09:56:32Z|00300|binding|INFO|Setting lport 36eed41e-8c87-4e39-8638-1f088c4d480e down in Southbound Feb 20 04:56:32 localhost nova_compute[281288]: 2026-02-20 09:56:32.513 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:32.521 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], 
virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-7d64da0e-050b-4b53-8861-874f3c3ef083', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d64da0e-050b-4b53-8861-874f3c3ef083', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24080264-67e8-4d16-abbb-0767714bc8ff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=36eed41e-8c87-4e39-8638-1f088c4d480e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:32.522 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 36eed41e-8c87-4e39-8638-1f088c4d480e in datapath 7d64da0e-050b-4b53-8861-874f3c3ef083 unbound from our chassis#033[00m Feb 20 04:56:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:32.525 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d64da0e-050b-4b53-8861-874f3c3ef083, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:32.526 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5cf083-242b-4b41-8303-1e074300a68e]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:32 localhost nova_compute[281288]: 2026-02-20 09:56:32.535 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:33 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:33.105 2 INFO neutron.agent.securitygroups_rpc [None req-b75820d6-6baf-4494-b7b1-8acd63dcbbd9 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:33 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:33.139 2 INFO neutron.agent.securitygroups_rpc [None req-8fddf0ed-4d67-47fd-a98a-ec6a15c12895 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:33 localhost dnsmasq[315780]: exiting on receipt of SIGTERM Feb 20 04:56:33 localhost podman[317006]: 2026-02-20 09:56:33.234800619 +0000 UTC m=+0.057914349 container kill 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:33 localhost systemd[1]: libpod-6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8.scope: Deactivated successfully. 
Feb 20 04:56:33 localhost podman[317018]: 2026-02-20 09:56:33.284087244 +0000 UTC m=+0.039141639 container died 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:33 localhost podman[317018]: 2026-02-20 09:56:33.314835426 +0000 UTC m=+0.069889791 container cleanup 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:56:33 localhost systemd[1]: libpod-conmon-6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8.scope: Deactivated successfully. 
Feb 20 04:56:33 localhost podman[317020]: 2026-02-20 09:56:33.392679348 +0000 UTC m=+0.138353138 container remove 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:56:33 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:33.459 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:33 localhost nova_compute[281288]: 2026-02-20 09:56:33.761 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:56:33 localhost nova_compute[281288]: 2026-02-20 09:56:33.789 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 20 04:56:33 localhost nova_compute[281288]: 2026-02-20 09:56:33.790 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:56:33 localhost nova_compute[281288]: 2026-02-20 09:56:33.790 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:56:33 localhost nova_compute[281288]: 2026-02-20 09:56:33.814 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:56:33 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e153 e153: 6 total, 6 up, 6 in Feb 20 04:56:33 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:33.858 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:34 localhost ovn_controller[156798]: 2026-02-20T09:56:34Z|00301|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:56:34 localhost nova_compute[281288]: 2026-02-20 09:56:34.224 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:34 localhost systemd[1]: var-lib-containers-storage-overlay-3e2bd54cd7424e1b7846b66bd6e6847e221583891328681be5cb3316924e217c-merged.mount: Deactivated successfully. Feb 20 04:56:34 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8-userdata-shm.mount: Deactivated successfully. Feb 20 04:56:34 localhost systemd[1]: run-netns-qdhcp\x2d7d64da0e\x2d050b\x2d4b53\x2d8861\x2d874f3c3ef083.mount: Deactivated successfully. 
Feb 20 04:56:34 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:34.439 2 INFO neutron.agent.securitygroups_rpc [None req-ceb6fbf0-e236-46d5-ab31-4b9208acd398 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:34 localhost nova_compute[281288]: 2026-02-20 09:56:34.592 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:34 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e154 e154: 6 total, 6 up, 6 in Feb 20 04:56:35 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:35.737 2 INFO neutron.agent.securitygroups_rpc [None req-21b5cfcb-ef7f-4dc6-82f5-46fe7ab7fc9a 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e155 e155: 6 total, 6 up, 6 in Feb 20 04:56:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:36 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:36.670 2 INFO neutron.agent.securitygroups_rpc [None req-5f75f844-b31b-4010-9a93-efcf0b2c4eb8 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:36 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:36.894 2 INFO neutron.agent.securitygroups_rpc [None req-a5e98a26-c124-4fdc-9abc-b12558eae8ef f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:36 localhost ceph-mon[301857]: 
mon.np0005625204@2(peon).osd e156 e156: 6 total, 6 up, 6 in Feb 20 04:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:56:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:37.037 264355 INFO neutron.agent.linux.ip_lib [None req-3a002da4-7f7b-4ddd-b6b6-5656271a924c - - - - - -] Device tapc5a7b9ef-7d cannot be used as it has no MAC address#033[00m Feb 20 04:56:37 localhost podman[317049]: 2026-02-20 09:56:37.062873769 +0000 UTC m=+0.099844341 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:56:37 localhost nova_compute[281288]: 2026-02-20 09:56:37.066 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:37 localhost kernel: device tapc5a7b9ef-7d entered promiscuous mode Feb 20 04:56:37 localhost NetworkManager[5988]: [1771581397.0755] manager: (tapc5a7b9ef-7d): new Generic device (/org/freedesktop/NetworkManager/Devices/48) Feb 20 04:56:37 localhost nova_compute[281288]: 2026-02-20 09:56:37.076 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:37 localhost ovn_controller[156798]: 2026-02-20T09:56:37Z|00302|binding|INFO|Claiming lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce for this chassis. Feb 20 04:56:37 localhost ovn_controller[156798]: 2026-02-20T09:56:37Z|00303|binding|INFO|c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce: Claiming unknown Feb 20 04:56:37 localhost systemd-udevd[317082]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:56:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:37.093 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-91a3c914-50af-4619-8f46-93ff66e8b045', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91a3c914-50af-4619-8f46-93ff66e8b045', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62f842a102bd4d84b1f4d275ec6dbea2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8616b81b-c719-43b5-be2c-dbf68397c33b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:37.097 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce in datapath 91a3c914-50af-4619-8f46-93ff66e8b045 bound to our chassis#033[00m Feb 20 04:56:37 localhost podman[317049]: 2026-02-20 09:56:37.097889961 +0000 UTC m=+0.134860533 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, 
org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent) Feb 20 04:56:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:37.099 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 91a3c914-50af-4619-8f46-93ff66e8b045 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:56:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:37.100 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[eba41a9e-94bf-4600-a19f-433c96983724]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:37 localhost ovn_controller[156798]: 2026-02-20T09:56:37Z|00304|binding|INFO|Setting lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce ovn-installed in OVS Feb 20 04:56:37 localhost ovn_controller[156798]: 2026-02-20T09:56:37Z|00305|binding|INFO|Setting lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce up in Southbound Feb 20 04:56:37 localhost nova_compute[281288]: 2026-02-20 09:56:37.121 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:37 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:56:37 localhost podman[317048]: 2026-02-20 09:56:37.130221352 +0000 UTC m=+0.174850516 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:56:37 localhost nova_compute[281288]: 2026-02-20 09:56:37.153 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:37 localhost podman[317048]: 2026-02-20 09:56:37.161390757 +0000 UTC m=+0.206019871 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:56:37 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 04:56:37 localhost nova_compute[281288]: 2026-02-20 09:56:37.181 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:37 localhost systemd[1]: tmp-crun.4OAisw.mount: Deactivated successfully. 
Feb 20 04:56:37 localhost podman[317153]: Feb 20 04:56:37 localhost podman[317153]: 2026-02-20 09:56:37.989732129 +0000 UTC m=+0.088056612 container create 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 20 04:56:38 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:38.020 2 INFO neutron.agent.securitygroups_rpc [None req-a827ab5a-214a-4a1d-a84d-cac050b991d6 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:38 localhost systemd[1]: Started libpod-conmon-1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6.scope. Feb 20 04:56:38 localhost podman[317153]: 2026-02-20 09:56:37.947404845 +0000 UTC m=+0.045729328 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:56:38 localhost systemd[1]: Started libcrun container. 
Feb 20 04:56:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd26698bf47eaf748148ca73e0ba4205722c28c51e908d2f97ffc72f55d6d626/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:56:38 localhost podman[317153]: 2026-02-20 09:56:38.06625417 +0000 UTC m=+0.164578653 container init 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 20 04:56:38 localhost podman[317153]: 2026-02-20 09:56:38.072133488 +0000 UTC m=+0.170457971 container start 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 04:56:38 localhost dnsmasq[317172]: started, version 2.85 cachesize 150 Feb 20 04:56:38 localhost dnsmasq[317172]: DNS service limited to local subnets Feb 20 04:56:38 localhost dnsmasq[317172]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:56:38 localhost dnsmasq[317172]: warning: no upstream servers 
configured Feb 20 04:56:38 localhost dnsmasq-dhcp[317172]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:56:38 localhost dnsmasq[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/addn_hosts - 0 addresses Feb 20 04:56:38 localhost dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/host Feb 20 04:56:38 localhost dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/opts Feb 20 04:56:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:38.219 264355 INFO neutron.agent.dhcp.agent [None req-7b3d08e0-3d5a-4399-9db6-3150dc7d1478 - - - - - -] DHCP configuration for ports {'fa0abef2-8ae8-40ee-a86b-a1b7596c8d71'} is completed#033[00m Feb 20 04:56:38 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:56:38 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2694845083' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:56:38 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:56:38 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2694845083' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:56:38 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e157 e157: 6 total, 6 up, 6 in Feb 20 04:56:39 localhost nova_compute[281288]: 2026-02-20 09:56:39.620 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:39 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:56:39 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2206 writes, 22K keys, 2206 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.07 MB/s#012Cumulative WAL: 2206 writes, 2206 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2206 writes, 22K keys, 2206 commit groups, 1.0 writes per commit group, ingest: 39.05 MB, 0.07 MB/s#012Interval WAL: 2206 writes, 2206 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 183.9 0.13 0.06 7 0.019 0 0 0.0 0.0#012 L6 1/0 15.18 MB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 4.1 223.9 204.4 0.49 0.28 6 0.082 74K 2880 0.0 0.0#012 Sum 1/0 15.18 MB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 5.1 175.7 200.0 0.63 0.34 13 0.048 74K 2880 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 
5.1 176.3 200.6 0.62 0.34 12 0.052 74K 2880 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 0.0 223.9 204.4 0.49 0.28 6 0.082 74K 2880 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 186.7 0.13 0.06 6 0.022 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.024, interval 0.024#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.12 GB write, 0.21 MB/s write, 0.11 GB read, 0.18 MB/s read, 0.6 seconds#012Interval compaction: 0.12 GB write, 0.21 MB/s write, 0.11 GB read, 0.18 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x559a1eac51f0#2 capacity: 308.00 MB usage: 11.04 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000111 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(564,10.51 MB,3.41339%) FilterBlock(13,234.23 KB,0.0742677%) IndexBlock(13,300.55 KB,0.0952931%) Misc(1,0.00 KB,0%)#012#012** File Read Latency 
Histogram By Level [default] ** Feb 20 04:56:39 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e158 e158: 6 total, 6 up, 6 in Feb 20 04:56:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:41 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:41.714 2 INFO neutron.agent.securitygroups_rpc [None req-13f83c28-0ec5-483d-8133-f11a853f0aba f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:41.983 264355 INFO neutron.agent.linux.ip_lib [None req-23b58b6a-bbd0-4fc6-bbb9-bac7e4963005 - - - - - -] Device tape4daede4-dd cannot be used as it has no MAC address#033[00m Feb 20 04:56:42 localhost nova_compute[281288]: 2026-02-20 09:56:42.011 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:42 localhost kernel: device tape4daede4-dd entered promiscuous mode Feb 20 04:56:42 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:56:42 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:56:42 localhost NetworkManager[5988]: [1771581402.0248] manager: (tape4daede4-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/49) Feb 20 04:56:42 localhost nova_compute[281288]: 2026-02-20 09:56:42.024 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:42 localhost ovn_controller[156798]: 2026-02-20T09:56:42Z|00306|binding|INFO|Claiming lport 
e4daede4-dda0-4eeb-801e-ab2b266b4f0a for this chassis. Feb 20 04:56:42 localhost ovn_controller[156798]: 2026-02-20T09:56:42Z|00307|binding|INFO|e4daede4-dda0-4eeb-801e-ab2b266b4f0a: Claiming unknown Feb 20 04:56:42 localhost systemd-udevd[317268]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:56:42 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:42.039 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-183b90c2-0ae0-467a-8a71-cbddda06cd4d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-183b90c2-0ae0-467a-8a71-cbddda06cd4d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62f842a102bd4d84b1f4d275ec6dbea2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6aac0e58-10ca-4f2e-89c4-11cc34f042d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e4daede4-dda0-4eeb-801e-ab2b266b4f0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:42 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:42.041 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e4daede4-dda0-4eeb-801e-ab2b266b4f0a in datapath 183b90c2-0ae0-467a-8a71-cbddda06cd4d bound to our chassis#033[00m Feb 20 
04:56:42 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:42.043 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 183b90c2-0ae0-467a-8a71-cbddda06cd4d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:56:42 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:42.043 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc5127f-a2e8-4b2a-a121-102087a9e801]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:42 localhost journal[229984]: ethtool ioctl error on tape4daede4-dd: No such device Feb 20 04:56:42 localhost journal[229984]: ethtool ioctl error on tape4daede4-dd: No such device Feb 20 04:56:42 localhost ovn_controller[156798]: 2026-02-20T09:56:42Z|00308|binding|INFO|Setting lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a ovn-installed in OVS Feb 20 04:56:42 localhost ovn_controller[156798]: 2026-02-20T09:56:42Z|00309|binding|INFO|Setting lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a up in Southbound Feb 20 04:56:42 localhost journal[229984]: ethtool ioctl error on tape4daede4-dd: No such device Feb 20 04:56:42 localhost nova_compute[281288]: 2026-02-20 09:56:42.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:42 localhost journal[229984]: ethtool ioctl error on tape4daede4-dd: No such device Feb 20 04:56:42 localhost journal[229984]: ethtool ioctl error on tape4daede4-dd: No such device Feb 20 04:56:42 localhost journal[229984]: ethtool ioctl error on tape4daede4-dd: No such device Feb 20 04:56:42 localhost journal[229984]: ethtool ioctl error on tape4daede4-dd: No such device Feb 20 04:56:42 localhost journal[229984]: ethtool ioctl error on tape4daede4-dd: No such device Feb 20 04:56:42 localhost nova_compute[281288]: 
2026-02-20 09:56:42.101 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:42 localhost nova_compute[281288]: 2026-02-20 09:56:42.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:42 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:42.613 2 INFO neutron.agent.securitygroups_rpc [None req-92a7b9d6-6b07-465f-9755-118a416fc381 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:42 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e159 e159: 6 total, 6 up, 6 in Feb 20 04:56:42 localhost podman[317339]: Feb 20 04:56:42 localhost podman[317339]: 2026-02-20 09:56:42.956236915 +0000 UTC m=+0.093607232 container create 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:56:42 localhost systemd[1]: Started libpod-conmon-689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358.scope. Feb 20 04:56:43 localhost systemd[1]: Started libcrun container. 
Feb 20 04:56:43 localhost podman[317339]: 2026-02-20 09:56:42.912806327 +0000 UTC m=+0.050176694 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:56:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83bff4229e001c3857e1bb62e3030a2db0bdff95dede477c807197129286ef30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:56:43 localhost podman[317339]: 2026-02-20 09:56:43.024143485 +0000 UTC m=+0.161513802 container init 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true) Feb 20 04:56:43 localhost podman[317339]: 2026-02-20 09:56:43.037471779 +0000 UTC m=+0.174842086 container start 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:56:43 localhost dnsmasq[317358]: started, version 2.85 cachesize 150 Feb 20 04:56:43 localhost dnsmasq[317358]: DNS service limited to local subnets Feb 20 04:56:43 localhost dnsmasq[317358]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:56:43 localhost dnsmasq[317358]: warning: no upstream servers configured Feb 20 04:56:43 localhost dnsmasq-dhcp[317358]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 20 04:56:43 localhost dnsmasq[317358]: read /var/lib/neutron/dhcp/183b90c2-0ae0-467a-8a71-cbddda06cd4d/addn_hosts - 0 addresses Feb 20 04:56:43 localhost dnsmasq-dhcp[317358]: read /var/lib/neutron/dhcp/183b90c2-0ae0-467a-8a71-cbddda06cd4d/host Feb 20 04:56:43 localhost dnsmasq-dhcp[317358]: read /var/lib/neutron/dhcp/183b90c2-0ae0-467a-8a71-cbddda06cd4d/opts Feb 20 04:56:43 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:43.185 264355 INFO neutron.agent.dhcp.agent [None req-eae2d8f0-ed48-45d9-8e7b-4c2051db44a7 - - - - - -] DHCP configuration for ports {'cdd8a163-912b-4c7f-8799-245420141a50'} is completed#033[00m Feb 20 04:56:44 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:44.137 2 INFO neutron.agent.securitygroups_rpc [None req-ec1ba1f0-724c-41a9-85b6-8188470faaf7 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:44 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:56:44 localhost nova_compute[281288]: 2026-02-20 09:56:44.653 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:44 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:44.883 2 INFO neutron.agent.securitygroups_rpc [None req-868e4387-1930-45d7-9199-5bcd1f2558e0 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:56:45 localhost podman[317359]: 2026-02-20 09:56:45.159412665 +0000 UTC m=+0.090490606 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 20 04:56:45 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:45.170 2 INFO neutron.agent.securitygroups_rpc [None req-d92a2777-d32e-4211-954b-8d8918f6f596 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:45 localhost podman[317359]: 2026-02-20 09:56:45.172847783 +0000 UTC m=+0.103925754 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:45 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:56:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:46 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:46.478 2 INFO neutron.agent.securitygroups_rpc [None req-d647a860-3cfb-47b1-bd0e-3817969b125e f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:46 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:46.998 2 INFO neutron.agent.securitygroups_rpc [None req-5e0be3b9-12ef-421f-8325-abea826190b6 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:47 localhost sshd[317378]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:56:47 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:56:47 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/4040843744' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:56:47 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:56:47 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4040843744' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:56:47 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:47.572 2 INFO neutron.agent.securitygroups_rpc [None req-c2e08264-5759-4dbe-9f11-1020e63a5df8 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:47 localhost podman[241968]: time="2026-02-20T09:56:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:56:47 localhost podman[241968]: @ - - [20/Feb/2026:09:56:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158891 "" "Go-http-client/1.1" Feb 20 04:56:47 localhost podman[241968]: @ - - [20/Feb/2026:09:56:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19287 "" "Go-http-client/1.1" Feb 20 04:56:47 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e160 e160: 6 total, 6 up, 6 in Feb 20 04:56:48 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:56:48 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/10291474' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:56:48 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:56:48 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/10291474' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:56:49 localhost nova_compute[281288]: 2026-02-20 09:56:49.655 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:56:49 localhost nova_compute[281288]: 2026-02-20 09:56:49.661 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:49.728 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:49Z, description=, device_id=1d3f0830-6050-4ebf-baaf-b9d8b4a1ed67, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2ba0f911-9cd1-4881-a954-6bc0829ed2ed, ip_allocation=immediate, mac_address=fa:16:3e:c2:1e:8c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:34Z, description=, dns_domain=, id=91a3c914-50af-4619-8f46-93ff66e8b045, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1956959081, port_security_enabled=True, project_id=62f842a102bd4d84b1f4d275ec6dbea2, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=63121, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2340, status=ACTIVE, subnets=['eb9e42c2-7344-4772-9e65-da3068c65904'], tags=[], tenant_id=62f842a102bd4d84b1f4d275ec6dbea2, updated_at=2026-02-20T09:56:35Z, vlan_transparent=None, network_id=91a3c914-50af-4619-8f46-93ff66e8b045, port_security_enabled=False, project_id=62f842a102bd4d84b1f4d275ec6dbea2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2392, status=DOWN, tags=[], tenant_id=62f842a102bd4d84b1f4d275ec6dbea2, updated_at=2026-02-20T09:56:49Z on network 91a3c914-50af-4619-8f46-93ff66e8b045#033[00m Feb 20 04:56:49 localhost dnsmasq[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/addn_hosts - 1 addresses Feb 20 04:56:49 localhost dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/host Feb 20 04:56:49 localhost dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/opts Feb 20 04:56:49 localhost podman[317397]: 2026-02-20 09:56:49.944442899 +0000 UTC m=+0.066911751 container kill 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 20 04:56:50 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:50.158 264355 INFO neutron.agent.dhcp.agent [None req-f02725fb-8091-4836-abfa-579e6ca5e183 - - - - - -] DHCP configuration for ports {'2ba0f911-9cd1-4881-a954-6bc0829ed2ed'} is 
completed#033[00m Feb 20 04:56:50 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:50.722 2 INFO neutron.agent.securitygroups_rpc [None req-6b7b9439-974f-45e5-a614-5a8be0850c72 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:51.034 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:49Z, description=, device_id=1d3f0830-6050-4ebf-baaf-b9d8b4a1ed67, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2ba0f911-9cd1-4881-a954-6bc0829ed2ed, ip_allocation=immediate, mac_address=fa:16:3e:c2:1e:8c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:34Z, description=, dns_domain=, id=91a3c914-50af-4619-8f46-93ff66e8b045, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1956959081, port_security_enabled=True, project_id=62f842a102bd4d84b1f4d275ec6dbea2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63121, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2340, status=ACTIVE, subnets=['eb9e42c2-7344-4772-9e65-da3068c65904'], tags=[], tenant_id=62f842a102bd4d84b1f4d275ec6dbea2, updated_at=2026-02-20T09:56:35Z, vlan_transparent=None, network_id=91a3c914-50af-4619-8f46-93ff66e8b045, port_security_enabled=False, project_id=62f842a102bd4d84b1f4d275ec6dbea2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2392, 
status=DOWN, tags=[], tenant_id=62f842a102bd4d84b1f4d275ec6dbea2, updated_at=2026-02-20T09:56:49Z on network 91a3c914-50af-4619-8f46-93ff66e8b045#033[00m Feb 20 04:56:51 localhost dnsmasq[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/addn_hosts - 1 addresses Feb 20 04:56:51 localhost dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/host Feb 20 04:56:51 localhost dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/opts Feb 20 04:56:51 localhost podman[317435]: 2026-02-20 09:56:51.233162098 +0000 UTC m=+0.063684052 container kill 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:56:51 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:51.307 2 INFO neutron.agent.securitygroups_rpc [None req-780282fe-fede-41a8-980f-26511e126244 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:51.500 264355 INFO neutron.agent.dhcp.agent [None req-124ef94c-65bc-4ecf-a0fc-1a6c75e8ae82 - - - - - -] DHCP configuration for ports {'2ba0f911-9cd1-4881-a954-6bc0829ed2ed'} is completed#033[00m Feb 20 04:56:51 localhost 
neutron_dhcp_agent[264351]: 2026-02-20 09:56:51.547 264355 INFO neutron.agent.linux.ip_lib [None req-e7815bc1-0006-47f0-bd78-5a13b3bcf1e5 - - - - - -] Device tapc3226333-1e cannot be used as it has no MAC address#033[00m Feb 20 04:56:51 localhost nova_compute[281288]: 2026-02-20 09:56:51.568 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:51 localhost kernel: device tapc3226333-1e entered promiscuous mode Feb 20 04:56:51 localhost ovn_controller[156798]: 2026-02-20T09:56:51Z|00310|binding|INFO|Claiming lport c3226333-1e26-4492-9c87-0c0e84626249 for this chassis. Feb 20 04:56:51 localhost NetworkManager[5988]: [1771581411.5768] manager: (tapc3226333-1e): new Generic device (/org/freedesktop/NetworkManager/Devices/50) Feb 20 04:56:51 localhost ovn_controller[156798]: 2026-02-20T09:56:51Z|00311|binding|INFO|c3226333-1e26-4492-9c87-0c0e84626249: Claiming unknown Feb 20 04:56:51 localhost nova_compute[281288]: 2026-02-20 09:56:51.578 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:51 localhost systemd-udevd[317466]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:56:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:51.587 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-623d6533-b3ff-446c-abe7-d04d15b2cb53', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-623d6533-b3ff-446c-abe7-d04d15b2cb53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfec81f2-09be-417a-b2dd-d08c91c1c606, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c3226333-1e26-4492-9c87-0c0e84626249) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:51.590 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c3226333-1e26-4492-9c87-0c0e84626249 in datapath 623d6533-b3ff-446c-abe7-d04d15b2cb53 bound to our chassis#033[00m Feb 20 04:56:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:51.592 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 623d6533-b3ff-446c-abe7-d04d15b2cb53 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:56:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:51.594 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d35a9f0a-a1b4-427c-a133-faca84670ace]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:51 localhost journal[229984]: ethtool ioctl error on tapc3226333-1e: No such device Feb 20 04:56:51 localhost nova_compute[281288]: 2026-02-20 09:56:51.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:51 localhost journal[229984]: ethtool ioctl error on tapc3226333-1e: No such device Feb 20 04:56:51 localhost ovn_controller[156798]: 2026-02-20T09:56:51Z|00312|binding|INFO|Setting lport c3226333-1e26-4492-9c87-0c0e84626249 ovn-installed in OVS Feb 20 04:56:51 localhost ovn_controller[156798]: 2026-02-20T09:56:51Z|00313|binding|INFO|Setting lport c3226333-1e26-4492-9c87-0c0e84626249 up in Southbound Feb 20 04:56:51 localhost journal[229984]: ethtool ioctl error on tapc3226333-1e: No such device Feb 20 04:56:51 localhost nova_compute[281288]: 2026-02-20 09:56:51.615 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:51 localhost journal[229984]: ethtool ioctl error on tapc3226333-1e: No such device Feb 20 04:56:51 localhost journal[229984]: ethtool ioctl error on tapc3226333-1e: No such device Feb 20 04:56:51 localhost journal[229984]: ethtool ioctl error on tapc3226333-1e: No such device Feb 20 04:56:51 localhost journal[229984]: ethtool ioctl error on tapc3226333-1e: No such device Feb 20 04:56:51 localhost journal[229984]: ethtool ioctl error on tapc3226333-1e: No such device Feb 20 04:56:51 localhost nova_compute[281288]: 2026-02-20 09:56:51.649 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:51 localhost nova_compute[281288]: 2026-02-20 09:56:51.676 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e161 e161: 6 total, 6 up, 6 in Feb 20 04:56:52 localhost dnsmasq[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/addn_hosts - 0 addresses Feb 20 04:56:52 localhost dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/host Feb 20 04:56:52 localhost dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/opts Feb 20 04:56:52 localhost podman[317528]: 2026-02-20 09:56:52.227549437 +0000 UTC m=+0.063335772 container kill 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:56:52 localhost ovn_controller[156798]: 2026-02-20T09:56:52Z|00314|binding|INFO|Releasing lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce from this chassis (sb_readonly=0) Feb 20 04:56:52 localhost ovn_controller[156798]: 2026-02-20T09:56:52Z|00315|binding|INFO|Setting lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce down in Southbound Feb 20 04:56:52 localhost nova_compute[281288]: 2026-02-20 09:56:52.369 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:52 localhost kernel: 
device tapc5a7b9ef-7d left promiscuous mode Feb 20 04:56:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:52.377 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-91a3c914-50af-4619-8f46-93ff66e8b045', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91a3c914-50af-4619-8f46-93ff66e8b045', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62f842a102bd4d84b1f4d275ec6dbea2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8616b81b-c719-43b5-be2c-dbf68397c33b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:52.379 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce in datapath 91a3c914-50af-4619-8f46-93ff66e8b045 unbound from our chassis#033[00m Feb 20 04:56:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:52.382 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 91a3c914-50af-4619-8f46-93ff66e8b045 or it has 
no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:56:52 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:52.382 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[12814e8a-cbce-4efa-9ce4-668fdd8a0ac3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:52 localhost nova_compute[281288]: 2026-02-20 09:56:52.386 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:52 localhost podman[317575]: Feb 20 04:56:52 localhost podman[317575]: 2026-02-20 09:56:52.619098567 +0000 UTC m=+0.093329823 container create 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:56:52 localhost systemd[1]: Started libpod-conmon-1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196.scope. Feb 20 04:56:52 localhost systemd[1]: Started libcrun container. 
Feb 20 04:56:52 localhost podman[317575]: 2026-02-20 09:56:52.578134054 +0000 UTC m=+0.052365330 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:56:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/757a2b6f79ab43144d1e25f65f21cd12cf34ce98c88daf04e540abccae414274/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:56:52 localhost podman[317575]: 2026-02-20 09:56:52.694114083 +0000 UTC m=+0.168345349 container init 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:56:52 localhost podman[317575]: 2026-02-20 09:56:52.703812456 +0000 UTC m=+0.178043722 container start 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 20 04:56:52 localhost dnsmasq[317594]: started, version 2.85 cachesize 150 Feb 20 04:56:52 localhost dnsmasq[317594]: DNS service limited to local subnets Feb 20 04:56:52 localhost dnsmasq[317594]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:56:52 localhost dnsmasq[317594]: warning: no upstream servers configured Feb 20 04:56:52 localhost dnsmasq-dhcp[317594]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:56:52 localhost dnsmasq[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/addn_hosts - 0 addresses Feb 20 04:56:52 localhost dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/host Feb 20 04:56:52 localhost dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/opts Feb 20 04:56:52 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:52.873 264355 INFO neutron.agent.dhcp.agent [None req-a2c35da7-7920-406e-afba-a2c8c673ae73 - - - - - -] DHCP configuration for ports {'435cdbe8-09f2-4cba-8ebb-dfddbdfafca4'} is completed#033[00m Feb 20 04:56:53 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:53.862 2 INFO neutron.agent.securitygroups_rpc [None req-db944330-da94-4d69-be05-7c7a8491a44e 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:53.924 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:53Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4fdf0c37-625e-4dd0-9c79-90b44fc71337, ip_allocation=immediate, mac_address=fa:16:3e:40:b5:20, name=tempest-PortsTestJSON-1423588587, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:49Z, description=, dns_domain=, 
id=623d6533-b3ff-446c-abe7-d04d15b2cb53, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1760858149, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=65333, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2389, status=ACTIVE, subnets=['0edddd91-f1e3-43bf-aadf-6a42a8b0af87'], tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:50Z, vlan_transparent=None, network_id=623d6533-b3ff-446c-abe7-d04d15b2cb53, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['72ed92b6-af24-4274-854b-a52220405faf'], standard_attr_id=2400, status=DOWN, tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:53Z on network 623d6533-b3ff-446c-abe7-d04d15b2cb53#033[00m Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0. 
Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.932144) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31 Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581413932251, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2683, "num_deletes": 265, "total_data_size": 5071826, "memory_usage": 5129296, "flush_reason": "Manual Compaction"} Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581413947334, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 3309686, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20100, "largest_seqno": 22778, "table_properties": {"data_size": 3299048, "index_size": 6823, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 23780, "raw_average_key_size": 22, "raw_value_size": 3277368, "raw_average_value_size": 3043, "num_data_blocks": 288, "num_entries": 1077, "num_filter_entries": 1077, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581267, "oldest_key_time": 1771581267, "file_creation_time": 1771581413, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}} Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 15234 microseconds, and 8046 cpu microseconds. Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.947388) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 3309686 bytes OK Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.947422) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.949333) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.949358) EVENT_LOG_v1 {"time_micros": 1771581413949351, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.949381) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 5059584, prev total WAL file 
size 5059584, number of live WAL files 2. Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.950617) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end) Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(3232KB)], [30(15MB)] Feb 20 04:56:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581413950693, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 19223658, "oldest_snapshot_seqno": -1} Feb 20 04:56:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:56:53 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2065491436' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:56:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:56:53 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2065491436' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 12731 keys, 17974446 bytes, temperature: kUnknown Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581414031074, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 17974446, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17900785, "index_size": 40728, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31877, "raw_key_size": 340497, "raw_average_key_size": 26, "raw_value_size": 17683119, "raw_average_value_size": 1388, "num_data_blocks": 1552, "num_entries": 12731, "num_filter_entries": 12731, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581413, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}} Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. 
max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.031738) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 17974446 bytes Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.033543) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 238.4 rd, 222.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 15.2 +0.0 blob) out(17.1 +0.0 blob), read-write-amplify(11.2) write-amplify(5.4) OK, records in: 13279, records dropped: 548 output_compression: NoCompression Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.033584) EVENT_LOG_v1 {"time_micros": 1771581414033563, "job": 16, "event": "compaction_finished", "compaction_time_micros": 80640, "compaction_time_cpu_micros": 46683, "output_level": 6, "num_output_files": 1, "total_output_size": 17974446, "num_input_records": 13279, "num_output_records": 12731, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581414034413, "job": 16, "event": "table_file_deletion", "file_number": 32} Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:56:54 localhost ceph-mon[301857]: 
rocksdb: EVENT_LOG_v1 {"time_micros": 1771581414037855, "job": 16, "event": "table_file_deletion", "file_number": 30} Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.950554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037975) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037986) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037989) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:56:54 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:56:54 localhost dnsmasq[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/addn_hosts - 1 addresses Feb 20 04:56:54 localhost dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/host Feb 20 04:56:54 localhost podman[317626]: 2026-02-20 09:56:54.216672445 +0000 UTC m=+0.051658068 container kill 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:54 localhost dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/opts Feb 20 04:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:56:54 localhost systemd[1]: tmp-crun.GoAI4T.mount: Deactivated successfully. Feb 20 04:56:54 localhost sshd[317652]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:56:54 localhost dnsmasq[317358]: exiting on receipt of SIGTERM Feb 20 04:56:54 localhost podman[317640]: 2026-02-20 09:56:54.281821402 +0000 UTC m=+0.082857765 container kill 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 20 04:56:54 localhost systemd[1]: libpod-689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358.scope: Deactivated successfully. 
Feb 20 04:56:54 localhost podman[317650]: 2026-02-20 09:56:54.325100435 +0000 UTC m=+0.083020930 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:56:54 localhost podman[317667]: 2026-02-20 09:56:54.342046929 +0000 UTC m=+0.049842173 container died 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 20 04:56:54 localhost podman[317650]: 2026-02-20 09:56:54.364993415 +0000 UTC m=+0.122913910 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:56:54 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:56:54 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:54.416 264355 INFO neutron.agent.dhcp.agent [None req-720253c6-c001-445f-9718-c34db24c811b - - - - - -] DHCP configuration for ports {'4fdf0c37-625e-4dd0-9c79-90b44fc71337'} is completed#033[00m Feb 20 04:56:54 localhost podman[317667]: 2026-02-20 09:56:54.419549911 +0000 UTC m=+0.127345145 container cleanup 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 20 04:56:54 localhost systemd[1]: libpod-conmon-689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358.scope: Deactivated successfully. 
Feb 20 04:56:54 localhost podman[317674]: 2026-02-20 09:56:54.443211868 +0000 UTC m=+0.128653484 container remove 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:56:54 localhost ovn_controller[156798]: 2026-02-20T09:56:54Z|00316|binding|INFO|Releasing lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a from this chassis (sb_readonly=0) Feb 20 04:56:54 localhost kernel: device tape4daede4-dd left promiscuous mode Feb 20 04:56:54 localhost nova_compute[281288]: 2026-02-20 09:56:54.494 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:54 localhost ovn_controller[156798]: 2026-02-20T09:56:54Z|00317|binding|INFO|Setting lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a down in Southbound Feb 20 04:56:54 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:54.505 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-183b90c2-0ae0-467a-8a71-cbddda06cd4d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-183b90c2-0ae0-467a-8a71-cbddda06cd4d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62f842a102bd4d84b1f4d275ec6dbea2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6aac0e58-10ca-4f2e-89c4-11cc34f042d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e4daede4-dda0-4eeb-801e-ab2b266b4f0a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:54 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:54.507 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e4daede4-dda0-4eeb-801e-ab2b266b4f0a in datapath 183b90c2-0ae0-467a-8a71-cbddda06cd4d unbound from our chassis#033[00m Feb 20 04:56:54 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:54.509 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 183b90c2-0ae0-467a-8a71-cbddda06cd4d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:56:54 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:54.510 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[80d90cd9-8f17-4af2-94b9-c5989b332cdc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:54 localhost nova_compute[281288]: 2026-02-20 09:56:54.517 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:54 localhost systemd[1]: 
var-lib-containers-storage-overlay-83bff4229e001c3857e1bb62e3030a2db0bdff95dede477c807197129286ef30-merged.mount: Deactivated successfully. Feb 20 04:56:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358-userdata-shm.mount: Deactivated successfully. Feb 20 04:56:54 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:54.650 264355 INFO neutron.agent.dhcp.agent [None req-c51a5fd7-7025-4892-82c1-b63078fab0b3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:54 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:54.650 264355 INFO neutron.agent.dhcp.agent [None req-c51a5fd7-7025-4892-82c1-b63078fab0b3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:54 localhost systemd[1]: run-netns-qdhcp\x2d183b90c2\x2d0ae0\x2d467a\x2d8a71\x2dcbddda06cd4d.mount: Deactivated successfully. Feb 20 04:56:54 localhost nova_compute[281288]: 2026-02-20 09:56:54.658 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:54 localhost nova_compute[281288]: 2026-02-20 09:56:54.662 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:54 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:54.771 2 INFO neutron.agent.securitygroups_rpc [None req-2eacea7e-ffc4-4411-bf47-38f06768c1ff f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:54 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:54.897 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:54 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e162 e162: 6 
total, 6 up, 6 in Feb 20 04:56:55 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:55.097 2 INFO neutron.agent.securitygroups_rpc [None req-fc9a7b92-030e-427f-b2ca-50d327cf6718 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:55 localhost ovn_controller[156798]: 2026-02-20T09:56:55Z|00318|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:56:55 localhost nova_compute[281288]: 2026-02-20 09:56:55.619 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e163 e163: 6 total, 6 up, 6 in Feb 20 04:56:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:56:56 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:56.345 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:53Z, description=, device_id=ac59bc6c-a14a-43db-804a-739accb55b95, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4fdf0c37-625e-4dd0-9c79-90b44fc71337, ip_allocation=immediate, mac_address=fa:16:3e:40:b5:20, name=tempest-PortsTestJSON-1423588587, network_id=623d6533-b3ff-446c-abe7-d04d15b2cb53, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['72ed92b6-af24-4274-854b-a52220405faf'], standard_attr_id=2400, status=ACTIVE, 
tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:54Z on network 623d6533-b3ff-446c-abe7-d04d15b2cb53#033[00m Feb 20 04:56:56 localhost dnsmasq[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/addn_hosts - 1 addresses Feb 20 04:56:56 localhost dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/host Feb 20 04:56:56 localhost dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/opts Feb 20 04:56:56 localhost podman[317730]: 2026-02-20 09:56:56.551608215 +0000 UTC m=+0.048152322 container kill 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:56 localhost openstack_network_exporter[244414]: ERROR 09:56:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:56:56 localhost openstack_network_exporter[244414]: Feb 20 04:56:56 localhost openstack_network_exporter[244414]: ERROR 09:56:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:56:56 localhost openstack_network_exporter[244414]: Feb 20 04:56:56 localhost systemd[1]: tmp-crun.lk9UaJ.mount: Deactivated successfully. 
Feb 20 04:56:56 localhost dnsmasq[317172]: exiting on receipt of SIGTERM Feb 20 04:56:56 localhost podman[317766]: 2026-02-20 09:56:56.722405437 +0000 UTC m=+0.052339230 container kill 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:56 localhost systemd[1]: libpod-1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6.scope: Deactivated successfully. Feb 20 04:56:56 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:56.779 264355 INFO neutron.agent.dhcp.agent [None req-f70db66b-a68a-4f5e-b535-053047f6e0e8 - - - - - -] DHCP configuration for ports {'4fdf0c37-625e-4dd0-9c79-90b44fc71337'} is completed#033[00m Feb 20 04:56:56 localhost podman[317783]: 2026-02-20 09:56:56.785792939 +0000 UTC m=+0.048666987 container died 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:56 localhost podman[317783]: 2026-02-20 09:56:56.825530506 +0000 UTC m=+0.088404484 container remove 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:56:56 localhost systemd[1]: libpod-conmon-1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6.scope: Deactivated successfully. Feb 20 04:56:56 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:56.870 264355 INFO neutron.agent.dhcp.agent [None req-dab06233-d07f-4325-9d26-29b01790e6ee - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e164 e164: 6 total, 6 up, 6 in Feb 20 04:56:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:57.204 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:57 localhost nova_compute[281288]: 2026-02-20 09:56:57.224 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:57.224 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=13) 
matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:57.225 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:56:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:57.226 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:56:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:57.268 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8:0:1:f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:57.269 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated#033[00m Feb 20 04:56:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:57.298 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:57.299 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[49153975-1579-4d97-aad6-b9c45ebd4a2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:57 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:57.478 2 INFO neutron.agent.securitygroups_rpc [None req-8da4750a-6dfd-49c6-9c80-6371938bf016 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] 
Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:56:57 localhost systemd[1]: var-lib-containers-storage-overlay-fd26698bf47eaf748148ca73e0ba4205722c28c51e908d2f97ffc72f55d6d626-merged.mount: Deactivated successfully. Feb 20 04:56:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6-userdata-shm.mount: Deactivated successfully. Feb 20 04:56:57 localhost systemd[1]: run-netns-qdhcp\x2d91a3c914\x2d50af\x2d4619\x2d8f46\x2d93ff66e8b045.mount: Deactivated successfully. Feb 20 04:56:57 localhost nova_compute[281288]: 2026-02-20 09:56:57.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:56:57 localhost nova_compute[281288]: 2026-02-20 09:56:57.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:56:57 localhost systemd[1]: tmp-crun.jdWdUe.mount: Deactivated successfully. 
Feb 20 04:56:57 localhost dnsmasq[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/addn_hosts - 0 addresses Feb 20 04:56:57 localhost dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/host Feb 20 04:56:57 localhost dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/opts Feb 20 04:56:57 localhost podman[317824]: 2026-02-20 09:56:57.738459623 +0000 UTC m=+0.071288914 container kill 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:56:57 localhost nova_compute[281288]: 2026-02-20 09:56:57.759 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:56:57 localhost nova_compute[281288]: 2026-02-20 09:56:57.759 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:56:57 localhost nova_compute[281288]: 2026-02-20 09:56:57.759 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:56:57 localhost nova_compute[281288]: 2026-02-20 09:56:57.760 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:56:57 localhost nova_compute[281288]: 2026-02-20 09:56:57.760 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:56:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e165 e165: 6 total, 6 up, 6 in Feb 20 04:56:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:56:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2324158721' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:56:58 localhost kernel: device tapc3226333-1e left promiscuous mode Feb 20 04:56:58 localhost ovn_controller[156798]: 2026-02-20T09:56:58Z|00319|binding|INFO|Releasing lport c3226333-1e26-4492-9c87-0c0e84626249 from this chassis (sb_readonly=0) Feb 20 04:56:58 localhost ovn_controller[156798]: 2026-02-20T09:56:58Z|00320|binding|INFO|Setting lport c3226333-1e26-4492-9c87-0c0e84626249 down in Southbound Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.176 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:58.182 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-623d6533-b3ff-446c-abe7-d04d15b2cb53', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-623d6533-b3ff-446c-abe7-d04d15b2cb53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfec81f2-09be-417a-b2dd-d08c91c1c606, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=c3226333-1e26-4492-9c87-0c0e84626249) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:56:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:58.183 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c3226333-1e26-4492-9c87-0c0e84626249 in datapath 623d6533-b3ff-446c-abe7-d04d15b2cb53 unbound from our chassis#033[00m Feb 20 04:56:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:58.185 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 623d6533-b3ff-446c-abe7-d04d15b2cb53, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:56:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:56:58.185 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[46277c38-0280-4e50-bd2c-e6a3b79ff1a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.195 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.199 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:58 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:58.253 2 INFO neutron.agent.securitygroups_rpc [None req-23120728-b18f-4316-9b23-0bff49b361e7 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:58 localhost 
nova_compute[281288]: 2026-02-20 09:56:58.267 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.267 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.487 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.488 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11304MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.489 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.489 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.801 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.802 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.802 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:56:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:56:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2859662458' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:56:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:56:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2859662458' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.855 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.924 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.925 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 
2026-02-20 09:56:58.954 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 04:56:58 localhost nova_compute[281288]: 2026-02-20 09:56:58.980 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_
AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 04:56:59 localhost nova_compute[281288]: 2026-02-20 09:56:59.019 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:56:59 localhost neutron_sriov_agent[257177]: 2026-02-20 09:56:59.039 2 INFO neutron.agent.securitygroups_rpc [None req-8dad7750-521e-4bcf-b6a0-3ff2b8fade37 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:56:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:56:59 localhost systemd[1]: tmp-crun.MzvD2H.mount: Deactivated successfully. 
Feb 20 04:56:59 localhost podman[317871]: 2026-02-20 09:56:59.154877505 +0000 UTC m=+0.092451196 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:56:59 localhost podman[317871]: 2026-02-20 09:56:59.166200119 +0000 UTC m=+0.103773860 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:56:59 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:56:59 localhost dnsmasq[317594]: exiting on receipt of SIGTERM Feb 20 04:56:59 localhost podman[317931]: 2026-02-20 09:56:59.32347224 +0000 UTC m=+0.057506165 container kill 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:56:59 localhost systemd[1]: libpod-1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196.scope: Deactivated successfully. Feb 20 04:56:59 localhost podman[317951]: 2026-02-20 09:56:59.407943144 +0000 UTC m=+0.057516157 container died 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:56:59 localhost podman[317951]: 2026-02-20 09:56:59.455190656 +0000 UTC m=+0.104763599 container remove 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 04:56:59 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:56:59 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1602955077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:56:59 localhost nova_compute[281288]: 2026-02-20 09:56:59.482 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:56:59 localhost nova_compute[281288]: 2026-02-20 09:56:59.488 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:56:59 localhost systemd[1]: libpod-conmon-1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196.scope: Deactivated successfully. 
Feb 20 04:56:59 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:59.501 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:56:59 localhost nova_compute[281288]: 2026-02-20 09:56:59.507 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:56:59 localhost nova_compute[281288]: 2026-02-20 09:56:59.510 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:56:59 localhost nova_compute[281288]: 2026-02-20 09:56:59.511 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:56:59 localhost nova_compute[281288]: 2026-02-20 09:56:59.661 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:56:59 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:56:59.800 264355 INFO neutron.agent.dhcp.agent [-] Network 
not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:00 localhost ovn_controller[156798]: 2026-02-20T09:57:00Z|00321|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:57:00 localhost nova_compute[281288]: 2026-02-20 09:57:00.077 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:00 localhost systemd[1]: var-lib-containers-storage-overlay-757a2b6f79ab43144d1e25f65f21cd12cf34ce98c88daf04e540abccae414274-merged.mount: Deactivated successfully. Feb 20 04:57:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196-userdata-shm.mount: Deactivated successfully. Feb 20 04:57:00 localhost systemd[1]: run-netns-qdhcp\x2d623d6533\x2db3ff\x2d446c\x2dabe7\x2dd04d15b2cb53.mount: Deactivated successfully. Feb 20 04:57:00 localhost nova_compute[281288]: 2026-02-20 09:57:00.511 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:00 localhost nova_compute[281288]: 2026-02-20 09:57:00.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:00 localhost nova_compute[281288]: 2026-02-20 09:57:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:00 localhost nova_compute[281288]: 
2026-02-20 09:57:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:57:01 localhost nova_compute[281288]: 2026-02-20 09:57:01.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:02 localhost nova_compute[281288]: 2026-02-20 09:57:02.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:02 localhost nova_compute[281288]: 2026-02-20 09:57:02.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:57:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:57:02 localhost podman[317973]: 2026-02-20 09:57:02.832410588 +0000 UTC m=+0.084315769 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, version=9.7, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1770267347, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:57:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e166 e166: 6 total, 6 up, 6 in Feb 20 04:57:02 localhost podman[317973]: 2026-02-20 09:57:02.871162474 +0000 UTC m=+0.123067655 container exec_died 
7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., distribution-scope=public, 
io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc.) Feb 20 04:57:02 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:57:03 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:03.034 2 INFO neutron.agent.securitygroups_rpc [None req-f7f4080a-576f-4ec0-afc4-2b369a4e24bc 90a02ec8973644daaf9f628e26b82aba 68587c4c15964f28ad6d155288e119b0 - - default default] Security group rule updated ['602964d2-c9d4-4795-879d-2f4697b07a9a']#033[00m Feb 20 04:57:03 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:03.608 2 INFO neutron.agent.securitygroups_rpc [None req-7415ebd2-08fb-4812-9524-708fe60e5aaa 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['efc53d5c-88f6-4ec9-8815-9d765811b12e']#033[00m Feb 20 04:57:03 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e167 e167: 6 total, 6 up, 6 in Feb 20 04:57:03 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:03.990 2 INFO neutron.agent.securitygroups_rpc [None req-f1e881f7-c612-45f2-b27b-1f5d8fe2e21f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] 
Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:57:04 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:04.436 2 INFO neutron.agent.securitygroups_rpc [None req-0e7411d4-9e8e-44df-aee4-9bd8dc94f75f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']#033[00m Feb 20 04:57:04 localhost nova_compute[281288]: 2026-02-20 09:57:04.664 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:05.184 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec4f10d3-21c0-4f32-97fb-37b6b004b601, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3568ed7b-9263-43af-b4fd-ae333afc9a3b) old=Port_Binding(mac=['fa:16:3e:68:6d:09 
10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:05.186 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3568ed7b-9263-43af-b4fd-ae333afc9a3b in datapath 039b20b8-16a8-495e-968a-63fcd66a566c updated#033[00m Feb 20 04:57:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:05.189 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 039b20b8-16a8-495e-968a-63fcd66a566c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:57:05 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:05.191 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[67ad4991-64b9-4f7e-b1b1-880ef2e7b029]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:05 localhost nova_compute[281288]: 2026-02-20 09:57:05.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:05 localhost nova_compute[281288]: 2026-02-20 09:57:05.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] 
Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:57:05 localhost nova_compute[281288]: 2026-02-20 09:57:05.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:57:05 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:05.910 2 INFO neutron.agent.securitygroups_rpc [None req-e5b5b6c2-ec9f-4e00-8939-9097e06787f1 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['c1686fb3-a7b5-4191-9c5e-7c249c4e6c3c', 'efc53d5c-88f6-4ec9-8815-9d765811b12e']#033[00m Feb 20 04:57:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:06.020 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:57:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:06.020 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:57:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:06.021 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:57:06 localhost nova_compute[281288]: 2026-02-20 09:57:06.082 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock 
"refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:57:06 localhost nova_compute[281288]: 2026-02-20 09:57:06.082 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:57:06 localhost nova_compute[281288]: 2026-02-20 09:57:06.082 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:57:06 localhost nova_compute[281288]: 2026-02-20 09:57:06.083 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:57:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:57:06 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:06.382 2 INFO neutron.agent.securitygroups_rpc [None req-15e35e99-a929-4578-aaaf-c6b96452307f 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['c1686fb3-a7b5-4191-9c5e-7c249c4e6c3c']#033[00m Feb 20 04:57:06 localhost nova_compute[281288]: 2026-02-20 09:57:06.859 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": 
"fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:57:06 localhost nova_compute[281288]: 2026-02-20 09:57:06.877 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:57:06 localhost nova_compute[281288]: 2026-02-20 09:57:06.878 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:57:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e168 e168: 6 total, 6 up, 6 in Feb 20 04:57:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:57:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:57:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e169 e169: 6 total, 6 up, 6 in Feb 20 04:57:08 localhost systemd[1]: tmp-crun.kZmkt0.mount: Deactivated successfully. Feb 20 04:57:08 localhost podman[317994]: 2026-02-20 09:57:08.157208685 +0000 UTC m=+0.089365444 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127) Feb 20 04:57:08 localhost podman[317994]: 2026-02-20 09:57:08.166148537 +0000 UTC m=+0.098305346 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:57:08 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:57:08 localhost podman[317993]: 2026-02-20 09:57:08.249728414 +0000 UTC m=+0.185834894 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller) Feb 20 04:57:08 localhost podman[317993]: 2026-02-20 09:57:08.292024833 +0000 UTC m=+0.228131283 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:57:08 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:57:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:57:08 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1936621399' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:57:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:57:08 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1936621399' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:57:09 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:09.630 2 INFO neutron.agent.securitygroups_rpc [None req-d187057a-39e5-4c52-82a0-d1bcafd46a90 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['46d5d21d-63a5-4d3d-a013-7b21b89cdba7']#033[00m Feb 20 04:57:09 localhost nova_compute[281288]: 2026-02-20 09:57:09.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:10.952 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec4f10d3-21c0-4f32-97fb-37b6b004b601, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3568ed7b-9263-43af-b4fd-ae333afc9a3b) old=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:10.954 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3568ed7b-9263-43af-b4fd-ae333afc9a3b in datapath 039b20b8-16a8-495e-968a-63fcd66a566c updated#033[00m Feb 20 04:57:10 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:10.956 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 039b20b8-16a8-495e-968a-63fcd66a566c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:57:10 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:57:10.957 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[fccf8251-7c70-4a6a-9c24-314ae1f8fca7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:11 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Feb 20 04:57:11 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:57:11 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:57:11 localhost sshd[318034]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:57:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:57:12 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:12.020 2 INFO neutron.agent.securitygroups_rpc [None req-6ce6cc4b-c215-41b7-8af5-e062eb4d8872 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['b7f2b362-1261-45d0-afca-4d7d4dc43da1', '66935af6-6884-4649-9f3d-6c32279f86ee', '46d5d21d-63a5-4d3d-a013-7b21b89cdba7']#033[00m Feb 20 04:57:12 
localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e170 e170: 6 total, 6 up, 6 in Feb 20 04:57:12 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:12.591 2 INFO neutron.agent.securitygroups_rpc [None req-9d686e86-35e8-431c-8fc4-b6265d5fa0d0 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['b7f2b362-1261-45d0-afca-4d7d4dc43da1', '66935af6-6884-4649-9f3d-6c32279f86ee']#033[00m Feb 20 04:57:12 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e171 e171: 6 total, 6 up, 6 in Feb 20 04:57:14 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:14.014 264355 INFO neutron.agent.linux.ip_lib [None req-ec7441c0-46d9-4736-8999-f022ef24d103 - - - - - -] Device tapc16513af-fd cannot be used as it has no MAC address#033[00m Feb 20 04:57:14 localhost nova_compute[281288]: 2026-02-20 09:57:14.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:14 localhost kernel: device tapc16513af-fd entered promiscuous mode Feb 20 04:57:14 localhost NetworkManager[5988]: [1771581434.0552] manager: (tapc16513af-fd): new Generic device (/org/freedesktop/NetworkManager/Devices/51) Feb 20 04:57:14 localhost ovn_controller[156798]: 2026-02-20T09:57:14Z|00322|binding|INFO|Claiming lport c16513af-fdad-437e-89f6-e90f98f0836a for this chassis. Feb 20 04:57:14 localhost ovn_controller[156798]: 2026-02-20T09:57:14Z|00323|binding|INFO|c16513af-fdad-437e-89f6-e90f98f0836a: Claiming unknown Feb 20 04:57:14 localhost nova_compute[281288]: 2026-02-20 09:57:14.056 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:14 localhost systemd-udevd[318046]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:57:14 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:14.067 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4eca95b1-f334-4c45-8797-de13f5964062', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4eca95b1-f334-4c45-8797-de13f5964062', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a08202c1391432d972dc0430612e0e0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=edefd936-7bd3-45c5-ab80-3ad63680dbf7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c16513af-fdad-437e-89f6-e90f98f0836a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:14 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:14.069 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c16513af-fdad-437e-89f6-e90f98f0836a in datapath 4eca95b1-f334-4c45-8797-de13f5964062 bound to our chassis#033[00m Feb 20 04:57:14 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:14.071 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4eca95b1-f334-4c45-8797-de13f5964062 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:57:14 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:14.073 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[08d5c43f-19fa-4206-bda7-b2b2e81587ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:14 localhost journal[229984]: ethtool ioctl error on tapc16513af-fd: No such device Feb 20 04:57:14 localhost nova_compute[281288]: 2026-02-20 09:57:14.099 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:14 localhost ovn_controller[156798]: 2026-02-20T09:57:14Z|00324|binding|INFO|Setting lport c16513af-fdad-437e-89f6-e90f98f0836a ovn-installed in OVS Feb 20 04:57:14 localhost ovn_controller[156798]: 2026-02-20T09:57:14Z|00325|binding|INFO|Setting lport c16513af-fdad-437e-89f6-e90f98f0836a up in Southbound Feb 20 04:57:14 localhost journal[229984]: ethtool ioctl error on tapc16513af-fd: No such device Feb 20 04:57:14 localhost journal[229984]: ethtool ioctl error on tapc16513af-fd: No such device Feb 20 04:57:14 localhost journal[229984]: ethtool ioctl error on tapc16513af-fd: No such device Feb 20 04:57:14 localhost journal[229984]: ethtool ioctl error on tapc16513af-fd: No such device Feb 20 04:57:14 localhost journal[229984]: ethtool ioctl error on tapc16513af-fd: No such device Feb 20 04:57:14 localhost journal[229984]: ethtool ioctl error on tapc16513af-fd: No such device Feb 20 04:57:14 localhost journal[229984]: ethtool ioctl error on tapc16513af-fd: No such device Feb 20 04:57:14 localhost nova_compute[281288]: 2026-02-20 09:57:14.140 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:14 localhost nova_compute[281288]: 2026-02-20 09:57:14.172 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:14 localhost nova_compute[281288]: 2026-02-20 09:57:14.697 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:15.039 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:6d:09'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '9', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec4f10d3-21c0-4f32-97fb-37b6b004b601, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3568ed7b-9263-43af-b4fd-ae333afc9a3b) old=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.18 10.100.0.2 10.100.0.34'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 
'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:15.041 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3568ed7b-9263-43af-b4fd-ae333afc9a3b in datapath 039b20b8-16a8-495e-968a-63fcd66a566c updated#033[00m Feb 20 04:57:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:15.043 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 039b20b8-16a8-495e-968a-63fcd66a566c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:57:15 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:15.044 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdb8ee4-3f8d-4c1f-8d59-d4df9f2b96f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:15 localhost podman[318117]: Feb 20 04:57:15 localhost podman[318117]: 2026-02-20 09:57:15.119038039 +0000 UTC m=+0.095462980 container create edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:57:15 localhost systemd[1]: Started 
libpod-conmon-edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38.scope. Feb 20 04:57:15 localhost podman[318117]: 2026-02-20 09:57:15.072510901 +0000 UTC m=+0.048935862 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:57:15 localhost systemd[1]: Started libcrun container. Feb 20 04:57:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b81691ba7110d0b98058372d9f0498590f655b1c412c70b155fbfe7659ea93fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:57:15 localhost podman[318117]: 2026-02-20 09:57:15.20475425 +0000 UTC m=+0.181179181 container init edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 20 04:57:15 localhost podman[318117]: 2026-02-20 09:57:15.213728234 +0000 UTC m=+0.190153175 container start edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127) Feb 20 04:57:15 localhost dnsmasq[318135]: started, version 2.85 cachesize 150 Feb 20 04:57:15 
localhost dnsmasq[318135]: DNS service limited to local subnets Feb 20 04:57:15 localhost dnsmasq[318135]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:57:15 localhost dnsmasq[318135]: warning: no upstream servers configured Feb 20 04:57:15 localhost dnsmasq-dhcp[318135]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:57:15 localhost dnsmasq[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/addn_hosts - 0 addresses Feb 20 04:57:15 localhost dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/host Feb 20 04:57:15 localhost dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/opts Feb 20 04:57:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:57:15 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Feb 20 04:57:15 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:57:15 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:57:15 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:15.320 264355 INFO neutron.agent.dhcp.agent [None req-ba488952-ff62-4881-bb5d-d7825b9a9db9 - - - - - -] DHCP configuration for ports {'62c9e6ac-55ec-4e9e-80f1-f21e18b1f81c'} is completed#033[00m Feb 20 04:57:15 localhost podman[318136]: 2026-02-20 09:57:15.327324004 +0000 UTC m=+0.080631307 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:57:15 localhost podman[318136]: 2026-02-20 09:57:15.342157026 +0000 UTC m=+0.095464339 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 20 04:57:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e172 e172: 6 total, 6 up, 6 in Feb 20 04:57:15 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:57:15 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:15.819 2 INFO neutron.agent.securitygroups_rpc [None req-e7815d36-a39d-42c8-a497-7fe4eae772f9 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']#033[00m Feb 20 04:57:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:57:16 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2050439858' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:57:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:57:16 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2050439858' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:57:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:57:16 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:16.327 2 INFO neutron.agent.securitygroups_rpc [None req-3bc90bc6-4752-435d-941c-f0e75fc5d0a5 e8d99e5aba074cfb8aea01d99045d2af 8a08202c1391432d972dc0430612e0e0 - - default default] Security group member updated ['49b521a4-2cce-4f1a-b690-2fa2cab68db5']#033[00m Feb 20 04:57:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e173 e173: 6 total, 6 up, 6 in Feb 20 04:57:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:16.421 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:57:15Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=715989dc-23f1-48d7-a326-2c1d3c713a9e, ip_allocation=immediate, mac_address=fa:16:3e:57:89:2e, name=tempest-TagsExtTest-1349879559, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:57:11Z, description=, dns_domain=, id=4eca95b1-f334-4c45-8797-de13f5964062, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-293434349, port_security_enabled=True, project_id=8a08202c1391432d972dc0430612e0e0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44681, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2521, status=ACTIVE, subnets=['6a34847d-7d7a-4411-adb9-dcdb762b6f01'], 
tags=[], tenant_id=8a08202c1391432d972dc0430612e0e0, updated_at=2026-02-20T09:57:12Z, vlan_transparent=None, network_id=4eca95b1-f334-4c45-8797-de13f5964062, port_security_enabled=True, project_id=8a08202c1391432d972dc0430612e0e0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['49b521a4-2cce-4f1a-b690-2fa2cab68db5'], standard_attr_id=2531, status=DOWN, tags=[], tenant_id=8a08202c1391432d972dc0430612e0e0, updated_at=2026-02-20T09:57:16Z on network 4eca95b1-f334-4c45-8797-de13f5964062#033[00m Feb 20 04:57:16 localhost dnsmasq[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/addn_hosts - 1 addresses Feb 20 04:57:16 localhost dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/host Feb 20 04:57:16 localhost dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/opts Feb 20 04:57:16 localhost podman[318173]: 2026-02-20 09:57:16.655941904 +0000 UTC m=+0.063880737 container kill edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 04:57:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:16.879 264355 INFO neutron.agent.dhcp.agent [None req-2f030488-9419-4142-9d82-aa29e86c9d1f - - - - - -] DHCP configuration for ports {'715989dc-23f1-48d7-a326-2c1d3c713a9e'} is completed#033[00m Feb 20 04:57:16 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:16.927 2 INFO neutron.agent.securitygroups_rpc [None 
req-4f303a3e-0093-4654-a2f8-5b0853d1acad 3fd5694d6e624148892ddc3041d2f0e1 4bc7f22347de4004b73776eab4064bd0 - - default default] Security group member updated ['c599d16d-0283-4cf2-8a39-4a506ff8f2f0']#033[00m Feb 20 04:57:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e174 e174: 6 total, 6 up, 6 in Feb 20 04:57:17 localhost podman[241968]: time="2026-02-20T09:57:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:57:17 localhost podman[241968]: @ - - [20/Feb/2026:09:57:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157080 "" "Go-http-client/1.1" Feb 20 04:57:17 localhost podman[241968]: @ - - [20/Feb/2026:09:57:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18799 "" "Go-http-client/1.1" Feb 20 04:57:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e175 e175: 6 total, 6 up, 6 in Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.212 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.213 12 INFO ceilometer.polling.manager [-] Polling 
pollster network.incoming.packets.drop in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.217 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e756da8-e38f-41d4-afaf-6555f506717a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.213784', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 
'message_id': '8aa95d4e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '3198116d2a162b8d7dd27a189c499fb47eb9e6ae2aba2e646575bf2143179831'}]}, 'timestamp': '2026-02-20 09:57:18.218455', '_unique_id': 'a7cf4ec84cf648db81177fc94225ae5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 
04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: 
[Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.221 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0b426184-2a33-45f1-8b73-fcee18e2e14e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.221113', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8aa9da26-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '8d36cf3b1308f127882c99a0733f296c4033d6c6da915a6c7171dedb75a00cba'}]}, 'timestamp': '2026-02-20 09:57:18.221606', '_unique_id': '03af605a448a433386990b3f67d8d984'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.236 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.237 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '120a3d9f-7682-4e73-bfa7-6bca5f9db7c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.223739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8aac4946-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '4ae99fb3fc3b199bff3fb41c20b0e9214e9403fa019a75f2e6dcd4eb13b0a35b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.223739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8aac5c38-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '6d3f11bf1b2e75adfeb04269de86d84869af846bfb294a0ff30741757bf4075d'}]}, 'timestamp': '2026-02-20 09:57:18.238015', '_unique_id': 'ae0d1377ae2c48cf97a9f4450b68c1f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.268 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.268 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c69d03f3-89f3-4130-aaf0-7d596339ecdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.240431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab10f08-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '2e375e242239e23160c258ca57b1c4211ad180ccd7d01d28933cbae61cc8c2d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.240431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab12344-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '997906ac3c9ae840d6f7b097ba56d9c46457bbdbe8d9de0e6c2eff9a70ffa12e'}]}, 'timestamp': '2026-02-20 09:57:18.269357', '_unique_id': '775bf6d473fd47eeb78c0c13a90d70d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.272 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70e869ed-943b-4fc8-83c9-97d2615f1c2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.272094', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab1a0a8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': 'ce9e75112d93eeab878ab3d20b7f99418350273bb4660634be488f5322938fab'}]}, 'timestamp': '2026-02-20 09:57:18.272559', '_unique_id': 'e612cd33550a437c98bf6f33cdfe9c04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.274 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.274 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2da5cf08-afaf-45f9-a534-c84af187f4bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.274904', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab20e76-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '7208da92c6d716ab8e99d530601dd23713a1049a145d86601cbd0a9532f63460'}]}, 'timestamp': '2026-02-20 09:57:18.275367', '_unique_id': '7336f7fc9cf64e969ebd51b4fb10fb11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 04:57:18
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.277 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.294 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eefd2161-099f-4731-bca9-be7711621fc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:57:18.277760', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8ab51e4a-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.5337437, 'message_signature': '670be395e11a0481ec84551029771770be49dea8fe6bbe044d8f9326c87d7450'}]}, 'timestamp': '2026-02-20 09:57:18.295496', '_unique_id': '12931a25edb54679bbc099565cf36dc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:57:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.298 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c16d816-82b9-4d03-9fcc-374c0ddef9a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.298585', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab5ac8e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 
'message_signature': '33555731d3e6f864879e9a9ca00f498949eda5d7abcb517e37ffc6a10bc152b4'}]}, 'timestamp': '2026-02-20 09:57:18.299080', '_unique_id': '0f84d22a45a64319aae948f1e7bd56af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b462a559-7319-4d33-8fd3-0ac3c5b76c7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.301310', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab61a84-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '914049685523df244357b1d81d1449103771ba8549bc71f225033c5243db9eb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.301310', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab630dc-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '1c58bcf3299561c960cfc6523aba9d88826d54a7b58fd0ab94e48bef2f1c747a'}]}, 'timestamp': '2026-02-20 09:57:18.302508', '_unique_id': '8c0baed8dcc74b53afc6f29c9381e341'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fd1734e8-776f-4936-9ec1-630bdf71b543', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.304895', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab6a18e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': 'c153345ec49f4099d89ea38462f832c8be029ca2e1aba4b662ebb091fa362ea7'}]}, 'timestamp': '2026-02-20 09:57:18.305345', '_unique_id': '5aa7989da31a4a61b0ab45f7490ba0fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.307 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9211cdbd-268a-4fe4-a674-6aa7dbcf60e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.307593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab70ef8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '12a4077fa20b987e9bc09f608f83aceb499bf07fdfb9ba502791d2284f55427f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.307593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab71eca-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '058bf5b0ad574a23e64caa6aa71b3cf98fe1bfee356ef42455e2492218f9aeea'}]}, 'timestamp': '2026-02-20 09:57:18.308522', '_unique_id': 'c6982f6f308e4f0194758fbe9301fc31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:57:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cd753d21-9c91-4011-b65b-61412fd84e97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.311168', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab796d4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '4fd41f4007db6642000431373d65409315fc28e78ed98a4d5ae6dd843c93052b'}]}, 'timestamp': '2026-02-20 09:57:18.311625', '_unique_id': '31dddf4b846b4c28a93104ad19bde4f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7971ab64-fe84-4646-98d8-e1846379142a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.313845', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab8020e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '52a08005e3d451e2ab573f356d6947195c55cb83a72afa8056eaa844354c8817'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.313845', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab811c2-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': 'ed6e37d3806e040bfe8aaaa2d4adac283bd10fd65607aca3dbb3f03de3fb7982'}]}, 'timestamp': '2026-02-20 09:57:18.314770', '_unique_id': 'f3626e1b56664dffb75540dd3a3eea71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1899c9a3-6059-4dd8-9911-39d769c85482', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.316973', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab87c2a-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '35a70c6ff7e507410b46c74163d7812b8c9a65152d4dd0456a1f25abb52d37ab'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.316973', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab88f76-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '644e92abb34af1d64ccf73686d52ee83782dc3ee264badb5b0498b7cc01764c5'}]}, 'timestamp': '2026-02-20 09:57:18.317964', '_unique_id': '04cb3f3217d040f78654d91a674311cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c72e93a-5383-4bfe-8be6-6889162b3ffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.319488', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab8d8a0-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '70635494c011ae197f9e311a0db383d2928ed897905c798e29cb87ca25ed8323'}]}, 'timestamp': '2026-02-20 09:57:18.319805', '_unique_id': '11106eaef8f447339cd40377b6e776df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52490b69-b223-4c6e-b5bc-df12c322686e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.321343', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab921b6-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '5a67188ed07cba673f7dc33fe4260f4ff117f11ae887123ef0be03261fa3353f'}]}, 'timestamp': '2026-02-20 09:57:18.321678', '_unique_id': '49cacc562e4e4d8799191c427185d54b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.323 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8c380bce-398a-4bc4-9d64-51a6cd79e4bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.323137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab96900-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '7880d2359bdda27ceac9dfd6cc4388e5b2e4c403c4aa11970c735e799f30287c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.323137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab9742c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '4d02e8e4631ceb72518b2507faa5c3094b4d9a5a4dd837f467d09dfc169784f4'}]}, 'timestamp': '2026-02-20 09:57:18.323769', '_unique_id': '6e9e7fc905bc48a38874f723e0e6c5aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f4eac1e-01f1-407c-8915-5f2b00f6cc31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.325276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab9bd9c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '810fee5f6a2d35ac0163726aece82b550188ba6272787c9d3ad506c38f732a7f'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.325276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab9c936-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '07d1778aea4d39e13c10adc1f01d62b89fde6debb6a226b909e10c754f890da9'}]}, 'timestamp': '2026-02-20 09:57:18.325927', '_unique_id': 'd1db5c6b355246bf8e509b8ab20988d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.327 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.327 12 DEBUG ceilometer.compute.pollsters [-] 
f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 18240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a182bd07-a8e3-46bb-ad81-c6fab47771b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18240000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:57:18.327340', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8aba0d9c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.5337437, 'message_signature': '0fe9bd7258dd60ce865c8c80eedf0d103f0110bb401bcbe14b8ee8570beada89'}]}, 'timestamp': '2026-02-20 09:57:18.327705', '_unique_id': '900b74ad02444f96b14d11ed3bd418cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 
04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9ef04b25-f6f6-44ac-89a6-12e0d7b131db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.329084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8aba4f32-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '67c86543522c570a0e70efdfeed399cb9b84f08dcf79bc83ad480d2a812775cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.329084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8aba5c5c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '46cfd91bef4b03702dcaf6f95768c57db7c4e697f89ede9ef3d242dc3e0b2fe2'}]}, 'timestamp': '2026-02-20 09:57:18.329717', '_unique_id': 'd790f24088b94f11aa0f651fa8ca6021'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.331 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3b52dca2-cac5-412b-9f46-b50f2e69ba04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.331132', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8aba9f78-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '93f691f32a3aeed537d22049fe467588ba44c3f655dc01706f9a102141d0b59d'}]}, 'timestamp': '2026-02-20 09:57:18.331428', '_unique_id': '0d160842cc6c4b5b862c1b3577b3bed2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:57:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:57:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging Feb 20 04:57:18 localhost ceph-mon[301857]: 
from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Feb 20 04:57:18 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Feb 20 04:57:18 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Feb 20 04:57:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e176 e176: 6 total, 6 up, 6 in Feb 20 04:57:19 localhost nova_compute[281288]: 2026-02-20 09:57:19.700 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:19 localhost nova_compute[281288]: 2026-02-20 09:57:19.705 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:20.850 2 INFO neutron.agent.securitygroups_rpc [None req-41f9c3ce-b340-465f-aa72-7be8aab7d24c 3fd5694d6e624148892ddc3041d2f0e1 4bc7f22347de4004b73776eab4064bd0 - - default default] Security group member updated ['c599d16d-0283-4cf2-8a39-4a506ff8f2f0']#033[00m Feb 20 04:57:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 20 04:57:21 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Feb 20 04:57:21 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:57:21 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:57:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e177 e177: 6 total, 6 up, 6 in Feb 20 04:57:21 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:21.736 2 INFO neutron.agent.securitygroups_rpc [None req-84019169-f531-4b37-ab25-b8fba57ed27f e8d99e5aba074cfb8aea01d99045d2af 8a08202c1391432d972dc0430612e0e0 - - default default] Security group member updated ['49b521a4-2cce-4f1a-b690-2fa2cab68db5']#033[00m Feb 20 04:57:21 localhost dnsmasq[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/addn_hosts - 0 addresses Feb 20 04:57:21 localhost dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/host Feb 20 04:57:21 localhost podman[318211]: 2026-02-20 09:57:21.998158391 +0000 UTC m=+0.060285358 container kill edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_managed=true) Feb 20 04:57:22 localhost dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/opts Feb 20 04:57:22 localhost dnsmasq[318135]: exiting on receipt of SIGTERM Feb 20 04:57:22 localhost podman[318251]: 2026-02-20 09:57:22.718912111 +0000 UTC m=+0.065965281 container kill edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:57:22 localhost systemd[1]: tmp-crun.iVbAjW.mount: Deactivated successfully. Feb 20 04:57:22 localhost systemd[1]: libpod-edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38.scope: Deactivated successfully. 
Feb 20 04:57:22 localhost podman[318265]: 2026-02-20 09:57:22.810538742 +0000 UTC m=+0.076907904 container died edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:57:22 localhost podman[318265]: 2026-02-20 09:57:22.848710225 +0000 UTC m=+0.115079327 container cleanup edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:57:22 localhost systemd[1]: libpod-conmon-edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38.scope: Deactivated successfully. 
Feb 20 04:57:22 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e178 e178: 6 total, 6 up, 6 in Feb 20 04:57:22 localhost podman[318267]: 2026-02-20 09:57:22.881787593 +0000 UTC m=+0.135699146 container remove edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:57:22 localhost kernel: device tapc16513af-fd left promiscuous mode Feb 20 04:57:22 localhost ovn_controller[156798]: 2026-02-20T09:57:22Z|00326|binding|INFO|Releasing lport c16513af-fdad-437e-89f6-e90f98f0836a from this chassis (sb_readonly=0) Feb 20 04:57:22 localhost ovn_controller[156798]: 2026-02-20T09:57:22Z|00327|binding|INFO|Setting lport c16513af-fdad-437e-89f6-e90f98f0836a down in Southbound Feb 20 04:57:22 localhost nova_compute[281288]: 2026-02-20 09:57:22.932 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:22 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:22.942 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 
'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4eca95b1-f334-4c45-8797-de13f5964062', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4eca95b1-f334-4c45-8797-de13f5964062', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a08202c1391432d972dc0430612e0e0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=edefd936-7bd3-45c5-ab80-3ad63680dbf7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c16513af-fdad-437e-89f6-e90f98f0836a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:22 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:22.944 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c16513af-fdad-437e-89f6-e90f98f0836a in datapath 4eca95b1-f334-4c45-8797-de13f5964062 unbound from our chassis#033[00m Feb 20 04:57:22 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:22.948 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4eca95b1-f334-4c45-8797-de13f5964062, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:57:22 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:22.951 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[11407500-704d-4ddf-b30e-29d2de4f5778]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:22 localhost nova_compute[281288]: 2026-02-20 09:57:22.953 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 
04:57:22 localhost systemd[1]: var-lib-containers-storage-overlay-b81691ba7110d0b98058372d9f0498590f655b1c412c70b155fbfe7659ea93fa-merged.mount: Deactivated successfully. Feb 20 04:57:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38-userdata-shm.mount: Deactivated successfully. Feb 20 04:57:23 localhost systemd[1]: run-netns-qdhcp\x2d4eca95b1\x2df334\x2d4c45\x2d8797\x2dde13f5964062.mount: Deactivated successfully. Feb 20 04:57:23 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:23.297 264355 INFO neutron.agent.dhcp.agent [None req-99012659-8adf-42c6-8bf6-c470f47099ca - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:23 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:23.297 264355 INFO neutron.agent.dhcp.agent [None req-99012659-8adf-42c6-8bf6-c470f47099ca - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:23 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:23.428 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:23 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. 
Feb 20 04:57:23 localhost ovn_controller[156798]: 2026-02-20T09:57:23Z|00328|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:57:23 localhost nova_compute[281288]: 2026-02-20 09:57:23.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:23 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e179 e179: 6 total, 6 up, 6 in Feb 20 04:57:24 localhost nova_compute[281288]: 2026-02-20 09:57:24.738 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:24 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Feb 20 04:57:24 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Feb 20 04:57:24 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Feb 20 04:57:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:57:25 localhost podman[318294]: 2026-02-20 09:57:25.15493305 +0000 UTC m=+0.089403965 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:57:25 localhost podman[318294]: 2026-02-20 09:57:25.179454188 +0000 UTC m=+0.113925153 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:57:25 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:57:25 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e180 e180: 6 total, 6 up, 6 in Feb 20 04:57:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:57:26 localhost openstack_network_exporter[244414]: ERROR 09:57:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:57:26 localhost openstack_network_exporter[244414]: Feb 20 04:57:26 localhost openstack_network_exporter[244414]: ERROR 09:57:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:57:26 localhost openstack_network_exporter[244414]: Feb 20 04:57:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e181 e181: 6 total, 6 up, 6 in Feb 20 04:57:27 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e182 e182: 6 total, 6 up, 6 in Feb 20 04:57:28 localhost sshd[318317]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:57:29 localhost nova_compute[281288]: 2026-02-20 09:57:29.740 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:57:29 localhost nova_compute[281288]: 2026-02-20 09:57:29.742 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:57:29 localhost nova_compute[281288]: 2026-02-20 09:57:29.742 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:57:29 
localhost nova_compute[281288]: 2026-02-20 09:57:29.742 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:57:29 localhost nova_compute[281288]: 2026-02-20 09:57:29.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:29 localhost nova_compute[281288]: 2026-02-20 09:57:29.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:57:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:57:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Feb 20 04:57:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Feb 20 04:57:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Feb 20 04:57:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e183 e183: 6 total, 6 up, 6 in Feb 20 04:57:30 localhost podman[318319]: 2026-02-20 09:57:30.150752994 +0000 UTC m=+0.090215870 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:57:30 localhost podman[318319]: 2026-02-20 09:57:30.197137727 +0000 UTC m=+0.136600613 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 04:57:30 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:57:30 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:30.751 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:57:32 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e184 e184: 6 total, 6 up, 6 in Feb 20 04:57:32 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e185 e185: 6 total, 6 up, 6 in Feb 20 04:57:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:57:33 localhost podman[318342]: 2026-02-20 09:57:33.155735509 +0000 UTC m=+0.085236398 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.) 
Feb 20 04:57:33 localhost podman[318342]: 2026-02-20 09:57:33.203102032 +0000 UTC m=+0.132602931 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, 
vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1770267347) Feb 20 04:57:33 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:57:33 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:57:33 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2716620294' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:57:33 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:57:33 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2716620294' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:57:33 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:33.411 2 INFO neutron.agent.securitygroups_rpc [None req-203ffc30-16ac-4832-a92b-6d9503978c8f fd5cbccf037047f8a8d2b9488223d9dc 92c7b74ddebf4a4a82ffeb6b9b8a9111 - - default default] Security group member updated ['23f09a95-adc3-4f59-96fa-bbbacd5ff83a']#033[00m Feb 20 04:57:33 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:33.796 2 INFO neutron.agent.securitygroups_rpc [None req-203ffc30-16ac-4832-a92b-6d9503978c8f fd5cbccf037047f8a8d2b9488223d9dc 92c7b74ddebf4a4a82ffeb6b9b8a9111 - - default default] Security group member updated ['23f09a95-adc3-4f59-96fa-bbbacd5ff83a']#033[00m Feb 20 04:57:34 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e186 e186: 6 total, 6 up, 6 in Feb 20 04:57:34 localhost nova_compute[281288]: 2026-02-20 09:57:34.770 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:57:34 localhost nova_compute[281288]: 2026-02-20 09:57:34.773 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:57:34 localhost nova_compute[281288]: 2026-02-20 09:57:34.773 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:57:34 localhost nova_compute[281288]: 2026-02-20 09:57:34.773 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:57:34 localhost nova_compute[281288]: 2026-02-20 09:57:34.787 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:34 localhost nova_compute[281288]: 2026-02-20 09:57:34.788 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:57:34 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:34.813 2 INFO neutron.agent.securitygroups_rpc [None req-3f70ffff-dbcf-4468-97f0-19b384f44318 fd5cbccf037047f8a8d2b9488223d9dc 92c7b74ddebf4a4a82ffeb6b9b8a9111 - - default default] Security group member updated ['23f09a95-adc3-4f59-96fa-bbbacd5ff83a']#033[00m Feb 20 04:57:34 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:34.854 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:35 localhost neutron_sriov_agent[257177]: 2026-02-20 09:57:35.473 2 INFO neutron.agent.securitygroups_rpc [None req-16cacc97-a725-4413-b643-7033155fe483 fd5cbccf037047f8a8d2b9488223d9dc 92c7b74ddebf4a4a82ffeb6b9b8a9111 - - default default] Security group member updated ['23f09a95-adc3-4f59-96fa-bbbacd5ff83a']#033[00m Feb 20 04:57:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:57:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e187 e187: 6 total, 6 up, 6 in Feb 20 04:57:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:57:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:57:39 localhost podman[318362]: 2026-02-20 09:57:39.151231088 +0000 UTC m=+0.083489024 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:57:39 localhost systemd[1]: tmp-crun.dFdZw3.mount: Deactivated successfully. 
Feb 20 04:57:39 localhost podman[318363]: 2026-02-20 09:57:39.247882623 +0000 UTC m=+0.171305580 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 04:57:39 localhost 
podman[318362]: 2026-02-20 09:57:39.255044312 +0000 UTC m=+0.187302208 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller) Feb 20 04:57:39 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:57:39 localhost podman[318363]: 2026-02-20 09:57:39.286262353 +0000 UTC m=+0.209685350 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:57:39 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:57:39 localhost nova_compute[281288]: 2026-02-20 09:57:39.788 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:39 localhost nova_compute[281288]: 2026-02-20 09:57:39.790 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:40 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:40.488 264355 INFO neutron.agent.linux.ip_lib [None req-7e3a4af5-26d0-4b97-ad9d-9cb19ae10ef5 - - - - - -] Device tapfc50cf28-29 cannot be used as it has no MAC address#033[00m Feb 20 04:57:40 localhost nova_compute[281288]: 2026-02-20 09:57:40.550 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:40 localhost kernel: device tapfc50cf28-29 entered promiscuous mode Feb 20 04:57:40 localhost NetworkManager[5988]: [1771581460.5581] manager: (tapfc50cf28-29): new Generic device (/org/freedesktop/NetworkManager/Devices/52) Feb 20 04:57:40 localhost nova_compute[281288]: 2026-02-20 09:57:40.558 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:40 localhost ovn_controller[156798]: 2026-02-20T09:57:40Z|00329|binding|INFO|Claiming lport fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 for this chassis. Feb 20 04:57:40 localhost ovn_controller[156798]: 2026-02-20T09:57:40Z|00330|binding|INFO|fc50cf28-29d3-47fd-a51c-af45ce6bd7a1: Claiming unknown Feb 20 04:57:40 localhost systemd-udevd[318416]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:57:40 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:40.575 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-530c1fe2-b1ac-42bf-9f43-13da698642f0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-530c1fe2-b1ac-42bf-9f43-13da698642f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92c7b74ddebf4a4a82ffeb6b9b8a9111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c5062c6-b980-4e27-a02e-d7f042725453, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc50cf28-29d3-47fd-a51c-af45ce6bd7a1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:40 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:40.577 162652 INFO neutron.agent.ovn.metadata.agent [-] Port fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 in datapath 530c1fe2-b1ac-42bf-9f43-13da698642f0 bound to our chassis#033[00m Feb 20 04:57:40 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:40.581 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 530c1fe2-b1ac-42bf-9f43-13da698642f0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:57:40 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:40.583 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[860e2e57-2086-4a8c-88e7-ab9e38c5cd58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:40 localhost journal[229984]: ethtool ioctl error on tapfc50cf28-29: No such device Feb 20 04:57:40 localhost journal[229984]: ethtool ioctl error on tapfc50cf28-29: No such device Feb 20 04:57:40 localhost ovn_controller[156798]: 2026-02-20T09:57:40Z|00331|binding|INFO|Setting lport fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 ovn-installed in OVS Feb 20 04:57:40 localhost ovn_controller[156798]: 2026-02-20T09:57:40Z|00332|binding|INFO|Setting lport fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 up in Southbound Feb 20 04:57:40 localhost nova_compute[281288]: 2026-02-20 09:57:40.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:40 localhost journal[229984]: ethtool ioctl error on tapfc50cf28-29: No such device Feb 20 04:57:40 localhost journal[229984]: ethtool ioctl error on tapfc50cf28-29: No such device Feb 20 04:57:40 localhost journal[229984]: ethtool ioctl error on tapfc50cf28-29: No such device Feb 20 04:57:40 localhost journal[229984]: ethtool ioctl error on tapfc50cf28-29: No such device Feb 20 04:57:40 localhost journal[229984]: ethtool ioctl error on tapfc50cf28-29: No such device Feb 20 04:57:40 localhost journal[229984]: ethtool ioctl error on tapfc50cf28-29: No such device Feb 20 04:57:40 localhost nova_compute[281288]: 2026-02-20 09:57:40.647 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:40 localhost nova_compute[281288]: 2026-02-20 09:57:40.679 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:41 localhost ovn_controller[156798]: 2026-02-20T09:57:41Z|00333|binding|INFO|Removing iface tapfc50cf28-29 ovn-installed in OVS Feb 20 04:57:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:41.111 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 868e8685-06f4-43ca-b5fe-c6d154a2841f with type ""#033[00m Feb 20 04:57:41 localhost ovn_controller[156798]: 2026-02-20T09:57:41Z|00334|binding|INFO|Removing lport fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 ovn-installed in OVS Feb 20 04:57:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:41.113 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-530c1fe2-b1ac-42bf-9f43-13da698642f0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-530c1fe2-b1ac-42bf-9f43-13da698642f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92c7b74ddebf4a4a82ffeb6b9b8a9111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c5062c6-b980-4e27-a02e-d7f042725453, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc50cf28-29d3-47fd-a51c-af45ce6bd7a1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 
04:57:41 localhost nova_compute[281288]: 2026-02-20 09:57:41.113 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:41.116 162652 INFO neutron.agent.ovn.metadata.agent [-] Port fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 in datapath 530c1fe2-b1ac-42bf-9f43-13da698642f0 unbound from our chassis#033[00m Feb 20 04:57:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:41.118 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 530c1fe2-b1ac-42bf-9f43-13da698642f0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:57:41 localhost nova_compute[281288]: 2026-02-20 09:57:41.118 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:41 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:41.118 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1a1a291e-092b-44b1-8ce2-4cf585bcb271]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:57:41 localhost podman[318487]: Feb 20 04:57:41 localhost ovn_controller[156798]: 2026-02-20T09:57:41Z|00335|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:57:41 localhost podman[318487]: 2026-02-20 09:57:41.546219609 +0000 UTC m=+0.091463167 container create 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 20 04:57:41 localhost nova_compute[281288]: 2026-02-20 09:57:41.560 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:41 localhost systemd[1]: Started libpod-conmon-2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460.scope. Feb 20 04:57:41 localhost podman[318487]: 2026-02-20 09:57:41.501511667 +0000 UTC m=+0.046755255 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:57:41 localhost systemd[1]: Started libcrun container. Feb 20 04:57:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c57f5625fab7ddc77f2282f6cf9f71a784ec5eef9d14cbb7d029b3c1e11247/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:57:41 localhost podman[318487]: 2026-02-20 09:57:41.621188503 +0000 UTC m=+0.166432071 container init 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:57:41 localhost podman[318487]: 2026-02-20 09:57:41.629874448 +0000 UTC m=+0.175118016 container 
start 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 04:57:41 localhost dnsmasq[318505]: started, version 2.85 cachesize 150 Feb 20 04:57:41 localhost dnsmasq[318505]: DNS service limited to local subnets Feb 20 04:57:41 localhost dnsmasq[318505]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:57:41 localhost dnsmasq[318505]: warning: no upstream servers configured Feb 20 04:57:41 localhost dnsmasq-dhcp[318505]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Feb 20 04:57:41 localhost dnsmasq[318505]: read /var/lib/neutron/dhcp/530c1fe2-b1ac-42bf-9f43-13da698642f0/addn_hosts - 0 addresses Feb 20 04:57:41 localhost dnsmasq-dhcp[318505]: read /var/lib/neutron/dhcp/530c1fe2-b1ac-42bf-9f43-13da698642f0/host Feb 20 04:57:41 localhost dnsmasq-dhcp[318505]: read /var/lib/neutron/dhcp/530c1fe2-b1ac-42bf-9f43-13da698642f0/opts Feb 20 04:57:41 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:41.741 264355 INFO neutron.agent.dhcp.agent [None req-35c9fc8a-e339-472e-affe-e03a51503b53 - - - - - -] DHCP configuration for ports {'dfe491bd-7fe5-4be1-94d7-71260e8282f7'} is completed#033[00m Feb 20 04:57:41 localhost dnsmasq[318505]: exiting on receipt of SIGTERM Feb 20 04:57:41 localhost podman[318528]: 2026-02-20 09:57:41.875269625 +0000 UTC m=+0.060277488 container kill 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 20 04:57:41 localhost systemd[1]: libpod-2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460.scope: Deactivated successfully. Feb 20 04:57:41 localhost podman[318554]: 2026-02-20 09:57:41.952143967 +0000 UTC m=+0.059314178 container died 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:57:42 localhost podman[318554]: 2026-02-20 09:57:42.003677228 +0000 UTC m=+0.110847399 container cleanup 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:57:42 localhost systemd[1]: 
libpod-conmon-2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460.scope: Deactivated successfully. Feb 20 04:57:42 localhost podman[318556]: 2026-02-20 09:57:42.082819809 +0000 UTC m=+0.184564955 container remove 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:57:42 localhost kernel: device tapfc50cf28-29 left promiscuous mode Feb 20 04:57:42 localhost nova_compute[281288]: 2026-02-20 09:57:42.098 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:42 localhost nova_compute[281288]: 2026-02-20 09:57:42.116 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:42.145 264355 INFO neutron.agent.dhcp.agent [None req-b5e18cee-fe6c-4955-acf8-2f35280e794b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:42.146 264355 INFO neutron.agent.dhcp.agent [None req-b5e18cee-fe6c-4955-acf8-2f35280e794b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:42 localhost systemd[1]: var-lib-containers-storage-overlay-a6c57f5625fab7ddc77f2282f6cf9f71a784ec5eef9d14cbb7d029b3c1e11247-merged.mount: Deactivated successfully. 
Feb 20 04:57:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460-userdata-shm.mount: Deactivated successfully. Feb 20 04:57:42 localhost systemd[1]: run-netns-qdhcp\x2d530c1fe2\x2db1ac\x2d42bf\x2d9f43\x2d13da698642f0.mount: Deactivated successfully. Feb 20 04:57:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:42.786 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:57:42 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:57:42 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:57:44 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:57:44 localhost nova_compute[281288]: 2026-02-20 09:57:44.829 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:57:46 localhost podman[318653]: 2026-02-20 09:57:46.146177409 +0000 UTC m=+0.083691880 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible) Feb 20 04:57:46 localhost podman[318653]: 2026-02-20 09:57:46.157974279 +0000 UTC m=+0.095488730 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image) Feb 20 04:57:46 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:57:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:57:47 localhost podman[241968]: time="2026-02-20T09:57:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:57:47 localhost podman[241968]: @ - - [20/Feb/2026:09:57:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:57:47 localhost podman[241968]: @ - - [20/Feb/2026:09:57:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18342 "" "Go-http-client/1.1" Feb 20 04:57:49 localhost nova_compute[281288]: 2026-02-20 09:57:49.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:49 localhost nova_compute[281288]: 2026-02-20 09:57:49.834 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:57:52 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e188 e188: 6 total, 6 up, 6 in Feb 20 04:57:53 localhost sshd[318673]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:57:54 localhost nova_compute[281288]: 2026-02-20 09:57:54.836 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:57:54 localhost nova_compute[281288]: 2026-02-20 09:57:54.839 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:57:54 localhost nova_compute[281288]: 2026-02-20 09:57:54.839 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:57:54 localhost nova_compute[281288]: 2026-02-20 09:57:54.839 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:57:54 localhost nova_compute[281288]: 2026-02-20 09:57:54.872 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:54 localhost nova_compute[281288]: 2026-02-20 09:57:54.873 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:57:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e189 e189: 6 total, 6 up, 6 in Feb 20 04:57:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:57:55 localhost podman[318675]: 2026-02-20 09:57:55.337519401 +0000 UTC m=+0.091341763 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:57:55 localhost podman[318675]: 2026-02-20 09:57:55.352353654 +0000 UTC m=+0.106176056 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:57:55 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:57:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:57:56 localhost openstack_network_exporter[244414]: ERROR 09:57:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:57:56 localhost openstack_network_exporter[244414]: Feb 20 04:57:56 localhost openstack_network_exporter[244414]: ERROR 09:57:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:57:56 localhost openstack_network_exporter[244414]: Feb 20 04:57:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:57:56 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3816358087' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:57:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:57:56 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3816358087' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:57:56 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:56.803 264355 INFO neutron.agent.linux.ip_lib [None req-e4a8c918-57fc-4c1c-9320-241fb609b547 - - - - - -] Device tapf3db41a2-50 cannot be used as it has no MAC address#033[00m Feb 20 04:57:56 localhost nova_compute[281288]: 2026-02-20 09:57:56.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:56 localhost kernel: device tapf3db41a2-50 entered promiscuous mode Feb 20 04:57:56 localhost NetworkManager[5988]: [1771581476.8388] manager: (tapf3db41a2-50): new Generic device (/org/freedesktop/NetworkManager/Devices/53) Feb 20 04:57:56 localhost ovn_controller[156798]: 2026-02-20T09:57:56Z|00336|binding|INFO|Claiming lport f3db41a2-50e3-4148-b9ec-2d158c7c524f for this chassis. Feb 20 04:57:56 localhost ovn_controller[156798]: 2026-02-20T09:57:56Z|00337|binding|INFO|f3db41a2-50e3-4148-b9ec-2d158c7c524f: Claiming unknown Feb 20 04:57:56 localhost nova_compute[281288]: 2026-02-20 09:57:56.842 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:56 localhost systemd-udevd[318709]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:57:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:56.851 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4bb090fa-9f17-4be7-ae34-cbbdab18a773', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bb090fa-9f17-4be7-ae34-cbbdab18a773', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d774fdc5-de30-406f-8a1c-54d7b2f97bb3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f3db41a2-50e3-4148-b9ec-2d158c7c524f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:56.853 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f3db41a2-50e3-4148-b9ec-2d158c7c524f in datapath 4bb090fa-9f17-4be7-ae34-cbbdab18a773 bound to our chassis#033[00m Feb 20 04:57:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:56.855 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4bb090fa-9f17-4be7-ae34-cbbdab18a773 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:57:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:56.856 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c831b79e-9a22-4427-877f-21d9db4c8d80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:56 localhost journal[229984]: ethtool ioctl error on tapf3db41a2-50: No such device Feb 20 04:57:56 localhost journal[229984]: ethtool ioctl error on tapf3db41a2-50: No such device Feb 20 04:57:56 localhost ovn_controller[156798]: 2026-02-20T09:57:56Z|00338|binding|INFO|Setting lport f3db41a2-50e3-4148-b9ec-2d158c7c524f ovn-installed in OVS Feb 20 04:57:56 localhost ovn_controller[156798]: 2026-02-20T09:57:56Z|00339|binding|INFO|Setting lport f3db41a2-50e3-4148-b9ec-2d158c7c524f up in Southbound Feb 20 04:57:56 localhost nova_compute[281288]: 2026-02-20 09:57:56.881 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:56 localhost journal[229984]: ethtool ioctl error on tapf3db41a2-50: No such device Feb 20 04:57:56 localhost journal[229984]: ethtool ioctl error on tapf3db41a2-50: No such device Feb 20 04:57:56 localhost journal[229984]: ethtool ioctl error on tapf3db41a2-50: No such device Feb 20 04:57:56 localhost journal[229984]: ethtool ioctl error on tapf3db41a2-50: No such device Feb 20 04:57:56 localhost journal[229984]: ethtool ioctl error on tapf3db41a2-50: No such device Feb 20 04:57:56 localhost journal[229984]: ethtool ioctl error on tapf3db41a2-50: No such device Feb 20 04:57:56 localhost nova_compute[281288]: 2026-02-20 09:57:56.924 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:56 localhost nova_compute[281288]: 2026-02-20 09:57:56.955 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:57.327 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:57 localhost nova_compute[281288]: 2026-02-20 09:57:57.328 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:57 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:57.329 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:57:57 localhost nova_compute[281288]: 2026-02-20 09:57:57.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:57 localhost podman[318776]: Feb 20 04:57:57 localhost podman[318776]: 2026-02-20 09:57:57.900290414 +0000 UTC m=+0.097440060 container create 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:57:57 localhost systemd[1]: Started libpod-conmon-36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7.scope. Feb 20 04:57:57 localhost podman[318776]: 2026-02-20 09:57:57.855068477 +0000 UTC m=+0.052218183 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:57:57 localhost systemd[1]: Started libcrun container. Feb 20 04:57:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d2a4d44a48183477bfe77e2460f5a0a928460e76df8182294ed0bcb1b4a2e9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:57:57 localhost podman[318776]: 2026-02-20 09:57:57.982981604 +0000 UTC m=+0.180131260 container init 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:57:57 localhost podman[318776]: 2026-02-20 09:57:57.993325199 +0000 UTC m=+0.190474865 container start 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:57:58 localhost dnsmasq[318794]: started, version 2.85 cachesize 150 Feb 20 04:57:58 localhost dnsmasq[318794]: DNS service limited to local subnets Feb 20 04:57:58 localhost dnsmasq[318794]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:57:58 localhost dnsmasq[318794]: warning: no upstream servers configured Feb 20 04:57:58 localhost dnsmasq-dhcp[318794]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:57:58 localhost dnsmasq[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/addn_hosts - 0 addresses Feb 20 04:57:58 localhost dnsmasq-dhcp[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/host Feb 20 04:57:58 localhost dnsmasq-dhcp[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/opts Feb 20 04:57:58 localhost ovn_controller[156798]: 2026-02-20T09:57:58Z|00340|binding|INFO|Removing iface tapf3db41a2-50 ovn-installed in OVS Feb 20 04:57:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:58.127 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port bc145312-347e-4752-96cb-e00a86df8e63 with type ""#033[00m Feb 20 04:57:58 localhost ovn_controller[156798]: 2026-02-20T09:57:58Z|00341|binding|INFO|Removing lport f3db41a2-50e3-4148-b9ec-2d158c7c524f ovn-installed in OVS Feb 20 04:57:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:58.129 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], 
port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4bb090fa-9f17-4be7-ae34-cbbdab18a773', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bb090fa-9f17-4be7-ae34-cbbdab18a773', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d774fdc5-de30-406f-8a1c-54d7b2f97bb3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f3db41a2-50e3-4148-b9ec-2d158c7c524f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:57:58 localhost nova_compute[281288]: 2026-02-20 09:57:58.129 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:58.132 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f3db41a2-50e3-4148-b9ec-2d158c7c524f in datapath 4bb090fa-9f17-4be7-ae34-cbbdab18a773 unbound from our chassis#033[00m Feb 20 04:57:58 localhost nova_compute[281288]: 2026-02-20 09:57:58.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:58.134 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for 
network 4bb090fa-9f17-4be7-ae34-cbbdab18a773, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:57:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:57:58.135 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4bb646-b22c-41b0-8f29-b3abe5ae2b14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:57:58 localhost kernel: device tapf3db41a2-50 left promiscuous mode Feb 20 04:57:58 localhost nova_compute[281288]: 2026-02-20 09:57:58.149 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.198 264355 INFO neutron.agent.dhcp.agent [None req-ce1ef91b-fd12-444e-a3b8-5d109028a4ed - - - - - -] DHCP configuration for ports {'9bf53a91-5c17-4ed0-911d-ea82253cfbbe'} is completed#033[00m Feb 20 04:57:58 localhost dnsmasq[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/addn_hosts - 0 addresses Feb 20 04:57:58 localhost dnsmasq-dhcp[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/host Feb 20 04:57:58 localhost podman[318814]: 2026-02-20 09:57:58.721264598 +0000 UTC m=+0.051894992 container kill 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:57:58 localhost dnsmasq-dhcp[318794]: read 
/var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/opts Feb 20 04:57:58 localhost nova_compute[281288]: 2026-02-20 09:57:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent [None req-31c6e7db-28d2-424f-ab15-dd0d4deffca8 - - - - - -] Unable to reload_allocations dhcp for 4bb090fa-9f17-4be7-ae34-cbbdab18a773.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapf3db41a2-50 not found in namespace qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773. Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 
09:57:58.745 264355 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR 
neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent return fut.result() Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent raise self._exception Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in 
__call__ Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapf3db41a2-50 not found in namespace qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773. 
Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent #033[00m Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.748 264355 INFO neutron.agent.dhcp.agent [None req-c2bf4659-23a7-4d94-a36c-8a2e456b8aa4 - - - - - -] Synchronizing state#033[00m Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.872 264355 INFO neutron.agent.dhcp.agent [None req-4e01f771-80af-4303-ac93-faffa05069c8 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.874 264355 INFO neutron.agent.dhcp.agent [-] Starting network 4bb090fa-9f17-4be7-ae34-cbbdab18a773 dhcp configuration#033[00m Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.874 264355 INFO neutron.agent.dhcp.agent [-] Finished network 4bb090fa-9f17-4be7-ae34-cbbdab18a773 dhcp configuration#033[00m Feb 20 04:57:58 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.875 264355 INFO neutron.agent.dhcp.agent [None req-4e01f771-80af-4303-ac93-faffa05069c8 - - - - - -] Synchronizing state complete#033[00m Feb 20 04:57:59 localhost dnsmasq[318794]: exiting on receipt of SIGTERM Feb 20 04:57:59 localhost systemd[1]: tmp-crun.6KX25h.mount: Deactivated successfully. 
Feb 20 04:57:59 localhost podman[318842]: 2026-02-20 09:57:59.174337453 +0000 UTC m=+0.069298633 container kill 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:57:59 localhost systemd[1]: libpod-36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7.scope: Deactivated successfully. Feb 20 04:57:59 localhost ovn_controller[156798]: 2026-02-20T09:57:59Z|00342|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:57:59 localhost nova_compute[281288]: 2026-02-20 09:57:59.252 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:59 localhost podman[318858]: 2026-02-20 09:57:59.294198875 +0000 UTC m=+0.093366387 container died 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:57:59 localhost systemd[1]: tmp-crun.Wa8jan.mount: Deactivated successfully. 
Feb 20 04:57:59 localhost podman[318858]: 2026-02-20 09:57:59.391568882 +0000 UTC m=+0.190736374 container remove 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:57:59 localhost systemd[1]: libpod-conmon-36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7.scope: Deactivated successfully. Feb 20 04:57:59 localhost nova_compute[281288]: 2026-02-20 09:57:59.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:57:59 localhost nova_compute[281288]: 2026-02-20 09:57:59.754 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:57:59 localhost nova_compute[281288]: 2026-02-20 09:57:59.755 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:57:59 localhost nova_compute[281288]: 2026-02-20 09:57:59.756 281292 DEBUG oslo_concurrency.lockutils 
[None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:57:59 localhost nova_compute[281288]: 2026-02-20 09:57:59.756 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:57:59 localhost nova_compute[281288]: 2026-02-20 09:57:59.756 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:57:59 localhost nova_compute[281288]: 2026-02-20 09:57:59.873 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:57:59 localhost systemd[1]: var-lib-containers-storage-overlay-8d2a4d44a48183477bfe77e2460f5a0a928460e76df8182294ed0bcb1b4a2e9a-merged.mount: Deactivated successfully. Feb 20 04:57:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7-userdata-shm.mount: Deactivated successfully. Feb 20 04:57:59 localhost systemd[1]: run-netns-qdhcp\x2d4bb090fa\x2d9f17\x2d4be7\x2dae34\x2dcbbdab18a773.mount: Deactivated successfully. Feb 20 04:58:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:58:00 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/4041107687' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.268 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.364 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.365 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.589 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.590 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11297MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.591 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.591 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.669 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.670 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.670 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:58:00 localhost nova_compute[281288]: 2026-02-20 09:58:00.703 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:58:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:58:00 localhost podman[318906]: 2026-02-20 09:58:00.887708346 +0000 UTC m=+0.101601547 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:58:00 localhost podman[318906]: 2026-02-20 09:58:00.921937639 +0000 UTC m=+0.135830870 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 04:58:00 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:58:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:58:01 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2856583038' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:58:01 localhost nova_compute[281288]: 2026-02-20 09:58:01.208 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:58:01 localhost nova_compute[281288]: 2026-02-20 09:58:01.216 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:58:01 localhost nova_compute[281288]: 2026-02-20 09:58:01.233 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:58:01 localhost nova_compute[281288]: 2026-02-20 09:58:01.235 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:58:01 localhost nova_compute[281288]: 2026-02-20 09:58:01.236 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:58:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:02 localhost nova_compute[281288]: 2026-02-20 09:58:02.236 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:02 localhost nova_compute[281288]: 2026-02-20 09:58:02.237 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:02 localhost nova_compute[281288]: 2026-02-20 09:58:02.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e190 e190: 6 total, 6 up, 6 in Feb 20 04:58:03 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:58:03 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1636974104' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:58:03 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:58:03 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1636974104' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:58:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:58:03 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:03.612 264355 INFO neutron.agent.linux.ip_lib [None req-ddb54f48-b62f-4e06-907e-8322adf99f73 - - - - - -] Device tapb3eab012-76 cannot be used as it has no MAC address#033[00m Feb 20 04:58:03 localhost podman[318951]: 2026-02-20 09:58:03.632912418 +0000 UTC m=+0.081976270 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, 
vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.642 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:03 localhost kernel: device tapb3eab012-76 entered promiscuous mode Feb 20 04:58:03 localhost ovn_controller[156798]: 2026-02-20T09:58:03Z|00343|binding|INFO|Claiming lport b3eab012-7666-4655-9b51-e4c7e9621497 for this chassis. Feb 20 04:58:03 localhost NetworkManager[5988]: [1771581483.6526] manager: (tapb3eab012-76): new Generic device (/org/freedesktop/NetworkManager/Devices/54) Feb 20 04:58:03 localhost ovn_controller[156798]: 2026-02-20T09:58:03Z|00344|binding|INFO|b3eab012-7666-4655-9b51-e4c7e9621497: Claiming unknown Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.652 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:03 localhost systemd-udevd[318978]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:58:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:03.663 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-93dbe9b6-4551-4902-9476-0f2070facdb5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93dbe9b6-4551-4902-9476-0f2070facdb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0e47cd34a784cbb89cbe56eafed5650', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7f70c52-a5b6-4d06-85ef-b3fdc5aa9c4b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b3eab012-7666-4655-9b51-e4c7e9621497) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:03.664 162652 INFO neutron.agent.ovn.metadata.agent [-] Port b3eab012-7666-4655-9b51-e4c7e9621497 in datapath 93dbe9b6-4551-4902-9476-0f2070facdb5 bound to our chassis#033[00m Feb 20 04:58:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:03.665 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 93dbe9b6-4551-4902-9476-0f2070facdb5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:58:03 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:03.666 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf34a15-5d34-4b33-9bc6-925eae68fa2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:03 localhost podman[318951]: 2026-02-20 09:58:03.674668219 +0000 UTC m=+0.123732061 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container) Feb 20 04:58:03 localhost journal[229984]: ethtool ioctl error on tapb3eab012-76: No such device Feb 20 04:58:03 localhost journal[229984]: ethtool ioctl error on tapb3eab012-76: No such device Feb 20 04:58:03 localhost ovn_controller[156798]: 2026-02-20T09:58:03Z|00345|binding|INFO|Setting lport b3eab012-7666-4655-9b51-e4c7e9621497 ovn-installed in OVS Feb 20 04:58:03 localhost ovn_controller[156798]: 2026-02-20T09:58:03Z|00346|binding|INFO|Setting lport b3eab012-7666-4655-9b51-e4c7e9621497 up in Southbound Feb 20 04:58:03 localhost journal[229984]: ethtool ioctl error on tapb3eab012-76: No such device Feb 20 04:58:03 localhost 
systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.692 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:03 localhost journal[229984]: ethtool ioctl error on tapb3eab012-76: No such device Feb 20 04:58:03 localhost journal[229984]: ethtool ioctl error on tapb3eab012-76: No such device Feb 20 04:58:03 localhost journal[229984]: ethtool ioctl error on tapb3eab012-76: No such device Feb 20 04:58:03 localhost journal[229984]: ethtool ioctl error on tapb3eab012-76: No such device Feb 20 04:58:03 localhost journal[229984]: ethtool ioctl error on tapb3eab012-76: No such device Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.737 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.738 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:58:03 localhost nova_compute[281288]: 2026-02-20 09:58:03.758 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:04 localhost podman[319049]: Feb 20 04:58:04 localhost podman[319049]: 2026-02-20 09:58:04.774155789 +0000 UTC m=+0.097156612 container create fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:04 localhost systemd[1]: Started libpod-conmon-fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb.scope. Feb 20 04:58:04 localhost podman[319049]: 2026-02-20 09:58:04.732013855 +0000 UTC m=+0.055014718 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:58:04 localhost systemd[1]: Started libcrun container. 
Feb 20 04:58:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6954e2904a7005976441dfd44549da912216319c18455669f4ef3b3e79ca01de/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:58:04 localhost podman[319049]: 2026-02-20 09:58:04.859127838 +0000 UTC m=+0.182128661 container init fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 20 04:58:04 localhost podman[319049]: 2026-02-20 09:58:04.870349639 +0000 UTC m=+0.193350462 container start fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:04 localhost dnsmasq[319067]: started, version 2.85 cachesize 150 Feb 20 04:58:04 localhost dnsmasq[319067]: DNS service limited to local subnets Feb 20 04:58:04 localhost dnsmasq[319067]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:58:04 localhost dnsmasq[319067]: warning: no upstream servers 
configured Feb 20 04:58:04 localhost nova_compute[281288]: 2026-02-20 09:58:04.878 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:04 localhost dnsmasq-dhcp[319067]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:58:04 localhost dnsmasq[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/addn_hosts - 0 addresses Feb 20 04:58:04 localhost dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/host Feb 20 04:58:04 localhost dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/opts Feb 20 04:58:04 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:04.991 264355 INFO neutron.agent.dhcp.agent [None req-603e188f-4a62-4308-8e61-a2be321918fc - - - - - -] DHCP configuration for ports {'e9546fa0-e2a1-4388-b155-54ab8a4b6a66'} is completed#033[00m Feb 20 04:58:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:06.020 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:58:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:06.021 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:58:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:06.021 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:58:06 localhost 
ceph-mon[301857]: mon.np0005625204@2(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:06 localhost nova_compute[281288]: 2026-02-20 09:58:06.619 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:06 localhost nova_compute[281288]: 2026-02-20 09:58:06.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:06 localhost nova_compute[281288]: 2026-02-20 09:58:06.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:58:06 localhost nova_compute[281288]: 2026-02-20 09:58:06.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:58:06 localhost nova_compute[281288]: 2026-02-20 09:58:06.882 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:58:06 localhost nova_compute[281288]: 2026-02-20 09:58:06.883 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:58:06 localhost nova_compute[281288]: 2026-02-20 
09:58:06.884 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:58:06 localhost nova_compute[281288]: 2026-02-20 09:58:06.884 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:58:07 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:07.331 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:58:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:07.920 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:58:07Z, description=, device_id=7557495a-ace7-48ee-949b-56260afaa059, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9a0d4ed4-2522-49a8-9226-4337c377056e, ip_allocation=immediate, mac_address=fa:16:3e:40:7f:8f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:58:02Z, description=, dns_domain=, id=93dbe9b6-4551-4902-9476-0f2070facdb5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-VolumesActionsTest-933360853-network, port_security_enabled=True, project_id=d0e47cd34a784cbb89cbe56eafed5650, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16329, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2853, status=ACTIVE, subnets=['be3c004e-54c2-476b-98f9-a1eb38b39ea4'], tags=[], tenant_id=d0e47cd34a784cbb89cbe56eafed5650, updated_at=2026-02-20T09:58:02Z, vlan_transparent=None, network_id=93dbe9b6-4551-4902-9476-0f2070facdb5, port_security_enabled=False, project_id=d0e47cd34a784cbb89cbe56eafed5650, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2880, status=DOWN, tags=[], tenant_id=d0e47cd34a784cbb89cbe56eafed5650, updated_at=2026-02-20T09:58:07Z on network 93dbe9b6-4551-4902-9476-0f2070facdb5#033[00m Feb 20 04:58:07 localhost sshd[319068]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:58:07 localhost nova_compute[281288]: 2026-02-20 09:58:07.948 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:58:07 localhost nova_compute[281288]: 2026-02-20 09:58:07.966 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:58:07 localhost nova_compute[281288]: 2026-02-20 09:58:07.967 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:58:08 localhost podman[319087]: 2026-02-20 09:58:08.149501629 +0000 UTC m=+0.068952372 container kill fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 20 04:58:08 localhost dnsmasq[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/addn_hosts - 1 addresses Feb 20 04:58:08 localhost dnsmasq-dhcp[319067]: read 
/var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/host Feb 20 04:58:08 localhost dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/opts Feb 20 04:58:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:08.463 264355 INFO neutron.agent.dhcp.agent [None req-7e34d078-2605-4fbc-acb7-bc6902701015 - - - - - -] DHCP configuration for ports {'9a0d4ed4-2522-49a8-9226-4337c377056e'} is completed#033[00m Feb 20 04:58:09 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:09.394 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:58:07Z, description=, device_id=7557495a-ace7-48ee-949b-56260afaa059, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9a0d4ed4-2522-49a8-9226-4337c377056e, ip_allocation=immediate, mac_address=fa:16:3e:40:7f:8f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:58:02Z, description=, dns_domain=, id=93dbe9b6-4551-4902-9476-0f2070facdb5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-933360853-network, port_security_enabled=True, project_id=d0e47cd34a784cbb89cbe56eafed5650, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16329, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2853, status=ACTIVE, subnets=['be3c004e-54c2-476b-98f9-a1eb38b39ea4'], tags=[], tenant_id=d0e47cd34a784cbb89cbe56eafed5650, updated_at=2026-02-20T09:58:02Z, vlan_transparent=None, network_id=93dbe9b6-4551-4902-9476-0f2070facdb5, port_security_enabled=False, project_id=d0e47cd34a784cbb89cbe56eafed5650, qos_network_policy_id=None, qos_policy_id=None, 
resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2880, status=DOWN, tags=[], tenant_id=d0e47cd34a784cbb89cbe56eafed5650, updated_at=2026-02-20T09:58:07Z on network 93dbe9b6-4551-4902-9476-0f2070facdb5#033[00m Feb 20 04:58:09 localhost dnsmasq[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/addn_hosts - 1 addresses Feb 20 04:58:09 localhost dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/host Feb 20 04:58:09 localhost dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/opts Feb 20 04:58:09 localhost podman[319125]: 2026-02-20 09:58:09.598493927 +0000 UTC m=+0.069792698 container kill fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 20 04:58:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:58:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:58:09 localhost podman[319140]: 2026-02-20 09:58:09.746338242 +0000 UTC m=+0.106061423 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 20 04:58:09 localhost 
podman[319139]: 2026-02-20 09:58:09.718804663 +0000 UTC m=+0.084033572 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:58:09 localhost podman[319140]: 2026-02-20 09:58:09.779161422 +0000 UTC m=+0.138884553 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Feb 20 04:58:09 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:58:09 localhost podman[319139]: 2026-02-20 09:58:09.802046159 +0000 UTC m=+0.167275038 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 20 04:58:09 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:58:09 localhost sshd[319190]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:58:09 localhost nova_compute[281288]: 2026-02-20 09:58:09.879 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:09 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:09.911 264355 INFO neutron.agent.dhcp.agent [None req-b32da0f6-1830-419d-b62f-3009676400b2 - - - - - -] DHCP configuration for ports {'9a0d4ed4-2522-49a8-9226-4337c377056e'} is completed#033[00m Feb 20 04:58:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e191 e191: 6 total, 6 up, 6 in Feb 20 04:58:11 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:11.128 264355 INFO neutron.agent.linux.ip_lib [None req-d0d0c787-ea20-4cde-846a-790e04278dc7 - - - - - -] Device tapedf7cdb1-89 cannot be used as it has no MAC address#033[00m Feb 20 04:58:11 localhost nova_compute[281288]: 2026-02-20 09:58:11.150 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:11 localhost kernel: device tapedf7cdb1-89 entered promiscuous mode Feb 20 04:58:11 localhost NetworkManager[5988]: [1771581491.1594] manager: (tapedf7cdb1-89): new Generic device (/org/freedesktop/NetworkManager/Devices/55) Feb 20 04:58:11 localhost ovn_controller[156798]: 2026-02-20T09:58:11Z|00347|binding|INFO|Claiming lport edf7cdb1-8914-4d5f-9318-a56053278a5b for this chassis. Feb 20 04:58:11 localhost ovn_controller[156798]: 2026-02-20T09:58:11Z|00348|binding|INFO|edf7cdb1-8914-4d5f-9318-a56053278a5b: Claiming unknown Feb 20 04:58:11 localhost nova_compute[281288]: 2026-02-20 09:58:11.160 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:11 localhost systemd-udevd[319202]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:58:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:11.173 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-80ba55c6-a1c6-4e76-942b-f70ca34ddad5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80ba55c6-a1c6-4e76-942b-f70ca34ddad5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=188c8087-bb75-458a-8eed-59a2b56d79c4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=edf7cdb1-8914-4d5f-9318-a56053278a5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:11 localhost ovn_controller[156798]: 2026-02-20T09:58:11Z|00349|binding|INFO|Setting lport edf7cdb1-8914-4d5f-9318-a56053278a5b ovn-installed in OVS Feb 20 04:58:11 localhost ovn_controller[156798]: 2026-02-20T09:58:11Z|00350|binding|INFO|Setting lport edf7cdb1-8914-4d5f-9318-a56053278a5b up in Southbound Feb 20 04:58:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:11.177 162652 INFO neutron.agent.ovn.metadata.agent [-] Port edf7cdb1-8914-4d5f-9318-a56053278a5b in datapath 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 bound to our chassis#033[00m 
Feb 20 04:58:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:11.179 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:58:11 localhost nova_compute[281288]: 2026-02-20 09:58:11.175 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:11 localhost nova_compute[281288]: 2026-02-20 09:58:11.176 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:11 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:11.180 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9a89e0a3-29bc-47bc-b119-f02c69a0596a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:11 localhost journal[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device Feb 20 04:58:11 localhost journal[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device Feb 20 04:58:11 localhost journal[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device Feb 20 04:58:11 localhost nova_compute[281288]: 2026-02-20 09:58:11.212 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:11 localhost journal[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device Feb 20 04:58:11 localhost journal[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device Feb 20 04:58:11 localhost journal[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device Feb 20 04:58:11 localhost journal[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device Feb 20 04:58:11 localhost 
journal[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device Feb 20 04:58:11 localhost nova_compute[281288]: 2026-02-20 09:58:11.255 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:11 localhost nova_compute[281288]: 2026-02-20 09:58:11.287 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:11 localhost sshd[319244]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:58:12 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e192 e192: 6 total, 6 up, 6 in Feb 20 04:58:12 localhost podman[319275]: Feb 20 04:58:12 localhost podman[319275]: 2026-02-20 09:58:12.219567946 +0000 UTC m=+0.094041327 container create 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 20 04:58:12 localhost systemd[1]: Started libpod-conmon-53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d.scope. Feb 20 04:58:12 localhost systemd[1]: tmp-crun.qQzq6n.mount: Deactivated successfully. 
Feb 20 04:58:12 localhost podman[319275]: 2026-02-20 09:58:12.175211875 +0000 UTC m=+0.049685256 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:58:12 localhost systemd[1]: Started libcrun container. Feb 20 04:58:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b49eced9f4fc65c05c8c092b97ef8318ea17ed43fd0245e646a30e3aa60417/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:58:12 localhost podman[319275]: 2026-02-20 09:58:12.307701251 +0000 UTC m=+0.182174642 container init 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:12 localhost podman[319275]: 2026-02-20 09:58:12.31848642 +0000 UTC m=+0.192959801 container start 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:58:12 localhost dnsmasq[319293]: started, version 2.85 cachesize 150 Feb 20 04:58:12 localhost dnsmasq[319293]: DNS service limited to local subnets Feb 20 04:58:12 localhost 
dnsmasq[319293]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:58:12 localhost dnsmasq[319293]: warning: no upstream servers configured Feb 20 04:58:12 localhost dnsmasq-dhcp[319293]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:58:12 localhost dnsmasq[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/addn_hosts - 0 addresses Feb 20 04:58:12 localhost dnsmasq-dhcp[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/host Feb 20 04:58:12 localhost dnsmasq-dhcp[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/opts Feb 20 04:58:12 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:12.473 264355 INFO neutron.agent.dhcp.agent [None req-cebf620c-6ca4-460d-b7d9-780231987c77 - - - - - -] DHCP configuration for ports {'d725344c-9c12-4f8f-b2ef-5d64cff6fbc5'} is completed#033[00m Feb 20 04:58:12 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:12.562 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c3f233a6-25d7-4255-a243-46db9de6fdef with type ""#033[00m Feb 20 04:58:12 localhost ovn_controller[156798]: 2026-02-20T09:58:12Z|00351|binding|INFO|Removing iface tapedf7cdb1-89 ovn-installed in OVS Feb 20 04:58:12 localhost ovn_controller[156798]: 2026-02-20T09:58:12Z|00352|binding|INFO|Removing lport edf7cdb1-8914-4d5f-9318-a56053278a5b ovn-installed in OVS Feb 20 04:58:12 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:12.564 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], 
ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-80ba55c6-a1c6-4e76-942b-f70ca34ddad5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80ba55c6-a1c6-4e76-942b-f70ca34ddad5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=188c8087-bb75-458a-8eed-59a2b56d79c4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=edf7cdb1-8914-4d5f-9318-a56053278a5b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:12 localhost nova_compute[281288]: 2026-02-20 09:58:12.564 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:12 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:12.567 162652 INFO neutron.agent.ovn.metadata.agent [-] Port edf7cdb1-8914-4d5f-9318-a56053278a5b in datapath 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 unbound from our chassis#033[00m Feb 20 04:58:12 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:12.570 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 80ba55c6-a1c6-4e76-942b-f70ca34ddad5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:58:12 localhost nova_compute[281288]: 2026-02-20 09:58:12.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:12 localhost 
ovn_metadata_agent[162647]: 2026-02-20 09:58:12.571 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd196d7-69d1-4af7-8541-6ff0d73bd0e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:12 localhost nova_compute[281288]: 2026-02-20 09:58:12.574 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:12 localhost kernel: device tapedf7cdb1-89 left promiscuous mode Feb 20 04:58:12 localhost nova_compute[281288]: 2026-02-20 09:58:12.593 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:13 localhost dnsmasq[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/addn_hosts - 0 addresses Feb 20 04:58:13 localhost dnsmasq-dhcp[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/host Feb 20 04:58:13 localhost dnsmasq-dhcp[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/opts Feb 20 04:58:13 localhost podman[319313]: 2026-02-20 09:58:13.120309889 +0000 UTC m=+0.057951466 container kill 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent [None req-f8bf157c-3c39-41ba-9b81-8c81d1c0304b - - - - - -] Unable to reload_allocations dhcp for 
80ba55c6-a1c6-4e76-942b-f70ca34ddad5.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapedf7cdb1-89 not found in namespace qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5. Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent 
File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File 
"/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent return fut.result() Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent raise self._exception Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 20 04:58:13 localhost 
neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapedf7cdb1-89 not found in namespace qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5. Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent #033[00m Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.152 264355 INFO neutron.agent.dhcp.agent [None req-4e01f771-80af-4303-ac93-faffa05069c8 - - - - - -] Synchronizing state#033[00m Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.285 264355 INFO neutron.agent.dhcp.agent [None req-18b14ea9-72f3-48ab-a5d6-7e417394bb56 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.286 264355 INFO neutron.agent.dhcp.agent [-] Starting network 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 dhcp configuration#033[00m Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.286 264355 INFO neutron.agent.dhcp.agent [-] Finished network 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 dhcp configuration#033[00m Feb 20 04:58:13 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.287 264355 INFO neutron.agent.dhcp.agent [None req-18b14ea9-72f3-48ab-a5d6-7e417394bb56 - - - - - -] Synchronizing state complete#033[00m Feb 20 04:58:13 localhost ovn_controller[156798]: 2026-02-20T09:58:13Z|00353|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:58:13 localhost 
nova_compute[281288]: 2026-02-20 09:58:13.429 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:13 localhost podman[319343]: 2026-02-20 09:58:13.531022924 +0000 UTC m=+0.062960000 container kill 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:13 localhost dnsmasq[319293]: exiting on receipt of SIGTERM Feb 20 04:58:13 localhost systemd[1]: libpod-53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d.scope: Deactivated successfully. 
Feb 20 04:58:13 localhost podman[319356]: 2026-02-20 09:58:13.586263107 +0000 UTC m=+0.044299371 container died 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:13 localhost podman[319356]: 2026-02-20 09:58:13.620581062 +0000 UTC m=+0.078617296 container cleanup 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 20 04:58:13 localhost systemd[1]: libpod-conmon-53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d.scope: Deactivated successfully. 
Feb 20 04:58:13 localhost podman[319358]: 2026-02-20 09:58:13.634398963 +0000 UTC m=+0.081744451 container remove 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:58:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e193 e193: 6 total, 6 up, 6 in Feb 20 04:58:14 localhost systemd[1]: var-lib-containers-storage-overlay-48b49eced9f4fc65c05c8c092b97ef8318ea17ed43fd0245e646a30e3aa60417-merged.mount: Deactivated successfully. Feb 20 04:58:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d-userdata-shm.mount: Deactivated successfully. Feb 20 04:58:14 localhost systemd[1]: run-netns-qdhcp\x2d80ba55c6\x2da1c6\x2d4e76\x2d942b\x2df70ca34ddad5.mount: Deactivated successfully. Feb 20 04:58:14 localhost nova_compute[281288]: 2026-02-20 09:58:14.911 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e194 e194: 6 total, 6 up, 6 in Feb 20 04:58:15 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. 
Feb 20 04:58:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:16 localhost nova_compute[281288]: 2026-02-20 09:58:16.816 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 04:58:17 localhost podman[319385]: 2026-02-20 09:58:17.1481336 +0000 UTC m=+0.085746904 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Feb 20 04:58:17 localhost podman[319385]: 2026-02-20 09:58:17.16422185 +0000 UTC m=+0.101835134 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 04:58:17 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:58:17 localhost podman[241968]: time="2026-02-20T09:58:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:58:17 localhost podman[241968]: @ - - [20/Feb/2026:09:58:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 20 04:58:17 localhost podman[241968]: @ - - [20/Feb/2026:09:58:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18823 "" "Go-http-client/1.1" Feb 20 04:58:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e195 e195: 6 total, 6 up, 6 in Feb 20 04:58:18 localhost sshd[319404]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:58:19 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:19.164 264355 INFO neutron.agent.linux.ip_lib [None req-64405f07-8e63-45d1-8b5e-a0d184eac1fb - - - - - -] Device tap53d7e2be-a1 cannot be used as it has no MAC address#033[00m Feb 20 04:58:19 localhost nova_compute[281288]: 2026-02-20 09:58:19.220 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:19 localhost kernel: device tap53d7e2be-a1 entered promiscuous mode Feb 20 04:58:19 localhost NetworkManager[5988]: [1771581499.2331] manager: (tap53d7e2be-a1): new Generic device (/org/freedesktop/NetworkManager/Devices/56) Feb 20 04:58:19 localhost nova_compute[281288]: 2026-02-20 09:58:19.235 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:19 localhost ovn_controller[156798]: 2026-02-20T09:58:19Z|00354|binding|INFO|Claiming lport 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 for this chassis. Feb 20 04:58:19 localhost ovn_controller[156798]: 2026-02-20T09:58:19Z|00355|binding|INFO|53d7e2be-a1e5-44d8-ac61-a6537aee6bf9: Claiming unknown Feb 20 04:58:19 localhost systemd-udevd[319416]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:58:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:19.249 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-375f3c2e-e621-47b1-ab46-942f46f000f6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-375f3c2e-e621-47b1-ab46-942f46f000f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, 
additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a0911e4-b523-4ec2-a3e4-c2816971929f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=53d7e2be-a1e5-44d8-ac61-a6537aee6bf9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:19.251 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 in datapath 375f3c2e-e621-47b1-ab46-942f46f000f6 bound to our chassis#033[00m Feb 20 04:58:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:19.253 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 375f3c2e-e621-47b1-ab46-942f46f000f6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:58:19 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:19.255 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc527df-0b4a-4aa7-afae-9e64bf5b7968]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:19 localhost ovn_controller[156798]: 2026-02-20T09:58:19Z|00356|binding|INFO|Setting lport 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 ovn-installed in OVS Feb 20 04:58:19 localhost ovn_controller[156798]: 2026-02-20T09:58:19Z|00357|binding|INFO|Setting lport 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 up in Southbound Feb 20 04:58:19 localhost nova_compute[281288]: 2026-02-20 09:58:19.272 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:19 localhost journal[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device Feb 20 04:58:19 localhost journal[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device Feb 20 04:58:19 
localhost journal[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device Feb 20 04:58:19 localhost journal[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device Feb 20 04:58:19 localhost journal[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device Feb 20 04:58:19 localhost journal[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device Feb 20 04:58:19 localhost journal[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device Feb 20 04:58:19 localhost journal[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device Feb 20 04:58:19 localhost nova_compute[281288]: 2026-02-20 09:58:19.314 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:19 localhost nova_compute[281288]: 2026-02-20 09:58:19.347 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:19 localhost nova_compute[281288]: 2026-02-20 09:58:19.789 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:19 localhost nova_compute[281288]: 2026-02-20 09:58:19.914 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:20 localhost podman[319485]: Feb 20 04:58:20 localhost podman[319485]: 2026-02-20 09:58:20.249960086 +0000 UTC m=+0.095774499 container create c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:58:20 localhost systemd[1]: Started libpod-conmon-c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62.scope. Feb 20 04:58:20 localhost podman[319485]: 2026-02-20 09:58:20.204889403 +0000 UTC m=+0.050703856 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:58:20 localhost systemd[1]: Started libcrun container. Feb 20 04:58:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce6b4e0a927c90fbf1419bb6b55ac40e63e6fb8985741dd6b996564328cf80bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:58:20 localhost podman[319485]: 2026-02-20 09:58:20.329132899 +0000 UTC m=+0.174947302 container init c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:20 localhost podman[319485]: 2026-02-20 09:58:20.338067181 +0000 UTC m=+0.183881584 container start c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:58:20 localhost dnsmasq[319503]: started, version 2.85 cachesize 150 Feb 20 04:58:20 localhost dnsmasq[319503]: DNS service limited to local subnets Feb 20 04:58:20 localhost dnsmasq[319503]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:58:20 localhost dnsmasq[319503]: warning: no upstream servers configured Feb 20 04:58:20 localhost dnsmasq-dhcp[319503]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:58:20 localhost dnsmasq[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/addn_hosts - 0 addresses Feb 20 04:58:20 localhost dnsmasq-dhcp[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/host Feb 20 04:58:20 localhost dnsmasq-dhcp[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/opts Feb 20 04:58:20 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:20.478 264355 INFO neutron.agent.dhcp.agent [None req-1f68b521-5b01-459b-8511-bc0710ce2eb4 - - - - - -] DHCP configuration for ports {'2fc13e4c-b894-4358-b28b-9fe96597fd36'} is completed#033[00m Feb 20 04:58:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:20.707 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 856512c4-e24e-424b-8516-4e8839e85823 with type ""#033[00m Feb 20 04:58:20 localhost ovn_controller[156798]: 2026-02-20T09:58:20Z|00358|binding|INFO|Removing iface tap53d7e2be-a1 ovn-installed in OVS Feb 20 04:58:20 localhost ovn_controller[156798]: 2026-02-20T09:58:20Z|00359|binding|INFO|Removing lport 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 ovn-installed in OVS Feb 20 04:58:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:20.710 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', 
conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-375f3c2e-e621-47b1-ab46-942f46f000f6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-375f3c2e-e621-47b1-ab46-942f46f000f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a0911e4-b523-4ec2-a3e4-c2816971929f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=53d7e2be-a1e5-44d8-ac61-a6537aee6bf9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:20.712 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 in datapath 375f3c2e-e621-47b1-ab46-942f46f000f6 unbound from our chassis#033[00m Feb 20 04:58:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:20.716 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 375f3c2e-e621-47b1-ab46-942f46f000f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:58:20 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:20.739 162782 DEBUG oslo.privsep.daemon [-] privsep: 
reply[c9d752f8-5d2e-46eb-9900-7777e2807a44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:20 localhost nova_compute[281288]: 2026-02-20 09:58:20.740 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:20 localhost nova_compute[281288]: 2026-02-20 09:58:20.746 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:20 localhost kernel: device tap53d7e2be-a1 left promiscuous mode Feb 20 04:58:20 localhost nova_compute[281288]: 2026-02-20 09:58:20.758 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:21 localhost dnsmasq[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/addn_hosts - 0 addresses Feb 20 04:58:21 localhost dnsmasq-dhcp[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/host Feb 20 04:58:21 localhost dnsmasq-dhcp[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/opts Feb 20 04:58:21 localhost podman[319522]: 2026-02-20 09:58:21.132842036 +0000 UTC m=+0.063768853 container kill c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent [None 
req-231747be-d089-4af3-a9e3-b8f752dcac4e - - - - - -] Unable to reload_allocations dhcp for 375f3c2e-e621-47b1-ab46-942f46f000f6.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap53d7e2be-a1 not found in namespace qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6. Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 20 04:58:21 localhost 
neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 
264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent return fut.result() Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent raise self._exception Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, 
kwargs, Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap53d7e2be-a1 not found in namespace qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6. Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent #033[00m Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.165 264355 INFO neutron.agent.dhcp.agent [None req-18b14ea9-72f3-48ab-a5d6-7e417394bb56 - - - - - -] Synchronizing state#033[00m Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.274 264355 INFO neutron.agent.dhcp.agent [None req-70d4c29f-e879-4681-ad82-c583ac60bbc8 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.274 264355 INFO neutron.agent.dhcp.agent [-] Starting network 375f3c2e-e621-47b1-ab46-942f46f000f6 dhcp configuration#033[00m Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.275 264355 INFO neutron.agent.dhcp.agent [-] Finished network 375f3c2e-e621-47b1-ab46-942f46f000f6 dhcp configuration#033[00m Feb 20 04:58:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.275 264355 INFO neutron.agent.dhcp.agent [None req-70d4c29f-e879-4681-ad82-c583ac60bbc8 - - - - - -] Synchronizing state complete#033[00m Feb 20 04:58:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 
20 04:58:21 localhost ovn_controller[156798]: 2026-02-20T09:58:21Z|00360|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:58:21 localhost nova_compute[281288]: 2026-02-20 09:58:21.387 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:21 localhost dnsmasq[319503]: exiting on receipt of SIGTERM Feb 20 04:58:21 localhost podman[319553]: 2026-02-20 09:58:21.544891551 +0000 UTC m=+0.057763121 container kill c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:21 localhost systemd[1]: libpod-c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62.scope: Deactivated successfully. 
Feb 20 04:58:21 localhost podman[319566]: 2026-02-20 09:58:21.618146663 +0000 UTC m=+0.056220274 container died c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127) Feb 20 04:58:21 localhost systemd[1]: tmp-crun.lf59km.mount: Deactivated successfully. Feb 20 04:58:21 localhost podman[319566]: 2026-02-20 09:58:21.656383248 +0000 UTC m=+0.094456819 container cleanup c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:21 localhost systemd[1]: libpod-conmon-c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62.scope: Deactivated successfully. 
Feb 20 04:58:21 localhost podman[319567]: 2026-02-20 09:58:21.693684945 +0000 UTC m=+0.125923628 container remove c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:58:22 localhost systemd[1]: var-lib-containers-storage-overlay-ce6b4e0a927c90fbf1419bb6b55ac40e63e6fb8985741dd6b996564328cf80bb-merged.mount: Deactivated successfully. Feb 20 04:58:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62-userdata-shm.mount: Deactivated successfully. Feb 20 04:58:22 localhost systemd[1]: run-netns-qdhcp\x2d375f3c2e\x2de621\x2d47b1\x2dab46\x2d942f46f000f6.mount: Deactivated successfully. 
Feb 20 04:58:22 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e196 e196: 6 total, 6 up, 6 in Feb 20 04:58:22 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:58:22 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:58:22 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:58:24 localhost nova_compute[281288]: 2026-02-20 09:58:24.949 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e197 e197: 6 total, 6 up, 6 in Feb 20 04:58:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:58:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 20 04:58:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' 
entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 20 04:58:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:58:26 localhost systemd[1]: tmp-crun.Csi2xy.mount: Deactivated successfully. Feb 20 04:58:26 localhost podman[319594]: 2026-02-20 09:58:26.170519145 +0000 UTC m=+0.098015427 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:58:26 localhost podman[319594]: 2026-02-20 09:58:26.180125578 +0000 UTC m=+0.107621850 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': 
'/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 20 04:58:26 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:58:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:26 localhost openstack_network_exporter[244414]: ERROR 09:58:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:58:26 localhost openstack_network_exporter[244414]: Feb 20 04:58:26 localhost openstack_network_exporter[244414]: ERROR 09:58:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:58:26 localhost openstack_network_exporter[244414]: Feb 20 04:58:27 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:27.242 264355 INFO neutron.agent.linux.ip_lib [None req-d38ae07c-887e-4698-b929-469cb11c791e - - - - - -] Device tap50325231-1b cannot be used as it has no MAC address#033[00m Feb 20 04:58:27 localhost nova_compute[281288]: 2026-02-20 09:58:27.271 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:27 localhost kernel: device tap50325231-1b entered promiscuous mode Feb 20 04:58:27 localhost NetworkManager[5988]: [1771581507.2799] manager: (tap50325231-1b): new Generic device (/org/freedesktop/NetworkManager/Devices/57) Feb 20 04:58:27 localhost nova_compute[281288]: 
2026-02-20 09:58:27.280 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:27 localhost ovn_controller[156798]: 2026-02-20T09:58:27Z|00361|binding|INFO|Claiming lport 50325231-1b72-40e0-af54-5e19218597d1 for this chassis. Feb 20 04:58:27 localhost ovn_controller[156798]: 2026-02-20T09:58:27Z|00362|binding|INFO|50325231-1b72-40e0-af54-5e19218597d1: Claiming unknown Feb 20 04:58:27 localhost systemd-udevd[319628]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:58:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:27.291 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-838109c2-99d3-418e-a5f0-0558fd60210c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-838109c2-99d3-418e-a5f0-0558fd60210c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c60e6b37-6fb4-4889-8f95-ad293363e22b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=50325231-1b72-40e0-af54-5e19218597d1) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:27.293 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 50325231-1b72-40e0-af54-5e19218597d1 in datapath 838109c2-99d3-418e-a5f0-0558fd60210c bound to our chassis#033[00m Feb 20 04:58:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:27.295 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 838109c2-99d3-418e-a5f0-0558fd60210c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:58:27 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:27.296 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[73a967ba-5353-4145-92e9-315f9f3ee905]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:27 localhost ovn_controller[156798]: 2026-02-20T09:58:27Z|00363|binding|INFO|Setting lport 50325231-1b72-40e0-af54-5e19218597d1 ovn-installed in OVS Feb 20 04:58:27 localhost ovn_controller[156798]: 2026-02-20T09:58:27Z|00364|binding|INFO|Setting lport 50325231-1b72-40e0-af54-5e19218597d1 up in Southbound Feb 20 04:58:27 localhost nova_compute[281288]: 2026-02-20 09:58:27.319 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:27 localhost nova_compute[281288]: 2026-02-20 09:58:27.365 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:27 localhost nova_compute[281288]: 2026-02-20 09:58:27.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:28 localhost ovn_metadata_agent[162647]: 2026-02-20 
09:58:28.330 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3e87c615-1295-4106-9d99-81bb6cceb56f with type ""#033[00m Feb 20 04:58:28 localhost ovn_controller[156798]: 2026-02-20T09:58:28Z|00365|binding|INFO|Removing iface tap50325231-1b ovn-installed in OVS Feb 20 04:58:28 localhost ovn_controller[156798]: 2026-02-20T09:58:28Z|00366|binding|INFO|Removing lport 50325231-1b72-40e0-af54-5e19218597d1 ovn-installed in OVS Feb 20 04:58:28 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:28.333 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-838109c2-99d3-418e-a5f0-0558fd60210c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-838109c2-99d3-418e-a5f0-0558fd60210c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c60e6b37-6fb4-4889-8f95-ad293363e22b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=50325231-1b72-40e0-af54-5e19218597d1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:28 localhost nova_compute[281288]: 2026-02-20 09:58:28.332 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:28 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:28.336 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 50325231-1b72-40e0-af54-5e19218597d1 in datapath 838109c2-99d3-418e-a5f0-0558fd60210c unbound from our chassis#033[00m Feb 20 04:58:28 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:28.339 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 838109c2-99d3-418e-a5f0-0558fd60210c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:58:28 localhost nova_compute[281288]: 2026-02-20 09:58:28.341 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:28 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:28.340 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b405b11d-4509-4b96-b7e7-019b2960f356]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:28 localhost podman[319683]: Feb 20 04:58:28 localhost podman[319683]: 2026-02-20 09:58:28.380132607 +0000 UTC m=+0.094023725 container create 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true) Feb 20 04:58:28 localhost systemd[1]: Started 
libpod-conmon-3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43.scope. Feb 20 04:58:28 localhost podman[319683]: 2026-02-20 09:58:28.33654815 +0000 UTC m=+0.050439258 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:58:28 localhost systemd[1]: Started libcrun container. Feb 20 04:58:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af85346b1dd7396e797437697a4bc476248c2106c48a90c416e25973a6117729/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:58:28 localhost podman[319683]: 2026-02-20 09:58:28.459199797 +0000 UTC m=+0.173090905 container init 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:28 localhost podman[319683]: 2026-02-20 09:58:28.472130981 +0000 UTC m=+0.186022089 container start 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Feb 20 04:58:28 localhost dnsmasq[319701]: started, version 2.85 cachesize 150 Feb 20 04:58:28 
localhost dnsmasq[319701]: DNS service limited to local subnets Feb 20 04:58:28 localhost dnsmasq[319701]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:58:28 localhost dnsmasq[319701]: warning: no upstream servers configured Feb 20 04:58:28 localhost dnsmasq-dhcp[319701]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:58:28 localhost dnsmasq[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/addn_hosts - 0 addresses Feb 20 04:58:28 localhost dnsmasq-dhcp[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/host Feb 20 04:58:28 localhost dnsmasq-dhcp[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/opts Feb 20 04:58:28 localhost kernel: device tap50325231-1b left promiscuous mode Feb 20 04:58:28 localhost nova_compute[281288]: 2026-02-20 09:58:28.600 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:28 localhost nova_compute[281288]: 2026-02-20 09:58:28.617 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.642 264355 INFO neutron.agent.dhcp.agent [None req-09c10960-720f-434d-9a7a-47df59168085 - - - - - -] DHCP configuration for ports {'5cac4abd-33fd-4748-939e-f73012dbea21'} is completed#033[00m Feb 20 04:58:28 localhost dnsmasq[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/addn_hosts - 0 addresses Feb 20 04:58:28 localhost dnsmasq-dhcp[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/host Feb 20 04:58:28 localhost dnsmasq-dhcp[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/opts Feb 20 04:58:28 localhost podman[319721]: 
2026-02-20 09:58:28.824981021 +0000 UTC m=+0.065489476 container kill 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent [None req-1e997006-4abd-4b9d-8dde-9196ebc1ab64 - - - - - -] Unable to reload_allocations dhcp for 838109c2-99d3-418e-a5f0-0558fd60210c.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap50325231-1b not found in namespace qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c. 
Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 20 04:58:28 
localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent return fut.result() Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent raise self._exception Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap50325231-1b not found in namespace qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c. Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent #033[00m Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.855 264355 INFO neutron.agent.dhcp.agent [None req-70d4c29f-e879-4681-ad82-c583ac60bbc8 - - - - - -] Synchronizing state#033[00m Feb 20 04:58:28 localhost ovn_controller[156798]: 2026-02-20T09:58:28Z|00367|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:58:28 localhost nova_compute[281288]: 2026-02-20 09:58:28.976 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:28 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.992 264355 INFO neutron.agent.dhcp.agent [None req-063963d4-8eb4-458e-839e-1886c1f7eefb - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:58:29 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:58:29 localhost dnsmasq[319701]: exiting on receipt of SIGTERM Feb 20 04:58:29 localhost podman[319751]: 2026-02-20 09:58:29.187284161 +0000 UTC m=+0.065947011 container kill 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:29 localhost systemd[1]: libpod-3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43.scope: Deactivated successfully. Feb 20 04:58:29 localhost podman[319764]: 2026-02-20 09:58:29.262193092 +0000 UTC m=+0.062415032 container died 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:58:29 localhost podman[319764]: 2026-02-20 09:58:29.293844857 +0000 UTC m=+0.094066767 container cleanup 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 04:58:29 localhost systemd[1]: libpod-conmon-3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43.scope: Deactivated successfully. 
Feb 20 04:58:29 localhost podman[319766]: 2026-02-20 09:58:29.34972549 +0000 UTC m=+0.137208721 container remove 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 04:58:29 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:29.373 264355 INFO neutron.agent.dhcp.agent [None req-e455f8f0-087d-4011-b3b1-264293719a26 - - - - - -] Synchronizing state complete#033[00m Feb 20 04:58:29 localhost systemd[1]: var-lib-containers-storage-overlay-af85346b1dd7396e797437697a4bc476248c2106c48a90c416e25973a6117729-merged.mount: Deactivated successfully. Feb 20 04:58:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43-userdata-shm.mount: Deactivated successfully. Feb 20 04:58:29 localhost systemd[1]: run-netns-qdhcp\x2d838109c2\x2d99d3\x2d418e\x2da5f0\x2d0558fd60210c.mount: Deactivated successfully. 
Feb 20 04:58:29 localhost nova_compute[281288]: 2026-02-20 09:58:29.979 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:29 localhost nova_compute[281288]: 2026-02-20 09:58:29.984 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:58:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:58:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:58:30 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1881893775' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:58:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:58:30 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1881893775' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:58:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:58:31 localhost podman[319792]: 2026-02-20 09:58:31.15295974 +0000 UTC m=+0.082739671 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:58:31 localhost podman[319792]: 2026-02-20 09:58:31.191083722 +0000 UTC m=+0.120863673 container exec_died 
f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:58:31 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:58:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:32 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:32.242 264355 INFO neutron.agent.linux.ip_lib [None req-6517a283-69ca-4a9d-949a-a8fed98055d1 - - - - - -] Device tap75e94885-97 cannot be used as it has no MAC address#033[00m Feb 20 04:58:32 localhost nova_compute[281288]: 2026-02-20 09:58:32.272 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:32 localhost kernel: device tap75e94885-97 entered promiscuous mode Feb 20 04:58:32 localhost NetworkManager[5988]: [1771581512.2820] manager: (tap75e94885-97): new Generic device (/org/freedesktop/NetworkManager/Devices/58) Feb 20 04:58:32 localhost nova_compute[281288]: 2026-02-20 09:58:32.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:32 localhost ovn_controller[156798]: 2026-02-20T09:58:32Z|00368|binding|INFO|Claiming lport 75e94885-97d0-4be1-9ebe-cfa150020c4f for this chassis. Feb 20 04:58:32 localhost ovn_controller[156798]: 2026-02-20T09:58:32Z|00369|binding|INFO|75e94885-97d0-4be1-9ebe-cfa150020c4f: Claiming unknown Feb 20 04:58:32 localhost systemd-udevd[319825]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:58:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:32.295 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7da8ed-e6e8-4b16-9eb3-015b0c928d92, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=75e94885-97d0-4be1-9ebe-cfa150020c4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:32.296 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 75e94885-97d0-4be1-9ebe-cfa150020c4f in datapath 01006bb5-6e96-485f-99d6-c3f27965c51b bound to our chassis#033[00m Feb 20 04:58:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:32.297 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 01006bb5-6e96-485f-99d6-c3f27965c51b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:58:32 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:32.298 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[895bdf37-5c51-4c36-8f0e-cbf59826b133]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:32 localhost journal[229984]: ethtool ioctl error on tap75e94885-97: No such device Feb 20 04:58:32 localhost journal[229984]: ethtool ioctl error on tap75e94885-97: No such device Feb 20 04:58:32 localhost ovn_controller[156798]: 2026-02-20T09:58:32Z|00370|binding|INFO|Setting lport 75e94885-97d0-4be1-9ebe-cfa150020c4f ovn-installed in OVS Feb 20 04:58:32 localhost ovn_controller[156798]: 2026-02-20T09:58:32Z|00371|binding|INFO|Setting lport 75e94885-97d0-4be1-9ebe-cfa150020c4f up in Southbound Feb 20 04:58:32 localhost nova_compute[281288]: 2026-02-20 09:58:32.320 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:32 localhost journal[229984]: ethtool ioctl error on tap75e94885-97: No such device Feb 20 04:58:32 localhost nova_compute[281288]: 2026-02-20 09:58:32.324 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:32 localhost journal[229984]: ethtool ioctl error on tap75e94885-97: No such device Feb 20 04:58:32 localhost journal[229984]: ethtool ioctl error on tap75e94885-97: No such device Feb 20 04:58:32 localhost journal[229984]: ethtool ioctl error on tap75e94885-97: No such device Feb 20 04:58:32 localhost journal[229984]: ethtool ioctl error on tap75e94885-97: No such device Feb 20 04:58:32 localhost journal[229984]: ethtool ioctl error on tap75e94885-97: No such device Feb 20 04:58:32 localhost nova_compute[281288]: 2026-02-20 09:58:32.365 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:32 localhost nova_compute[281288]: 2026-02-20 09:58:32.408 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:32 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 e198: 6 total, 6 up, 6 in Feb 20 04:58:33 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:58:33 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1453609140' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:58:33 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:58:33 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1453609140' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:58:33 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:58:33 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 20 04:58:33 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 20 04:58:33 localhost podman[319896]: Feb 20 04:58:33 localhost podman[319896]: 2026-02-20 09:58:33.359777408 +0000 UTC m=+0.090537879 container create 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:58:33 localhost systemd[1]: Started libpod-conmon-6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646.scope. Feb 20 04:58:33 localhost podman[319896]: 2026-02-20 09:58:33.315745576 +0000 UTC m=+0.046506087 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:58:33 localhost systemd[1]: Started libcrun container. Feb 20 04:58:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21ae561d6365eec38a5d27ea0386b57d32a2312c2b64cc194e30ff4e8fac29e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:58:33 localhost podman[319896]: 2026-02-20 09:58:33.439445265 +0000 UTC m=+0.170205776 container init 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:33 localhost podman[319896]: 2026-02-20 09:58:33.454511345 +0000 UTC m=+0.185271816 container start 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:58:33 localhost dnsmasq[319914]: started, version 2.85 cachesize 150 Feb 20 04:58:33 localhost dnsmasq[319914]: DNS service limited to local subnets Feb 20 04:58:33 localhost dnsmasq[319914]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:58:33 localhost dnsmasq[319914]: warning: no upstream servers configured Feb 20 04:58:33 localhost dnsmasq-dhcp[319914]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:58:33 localhost dnsmasq[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/addn_hosts - 0 addresses Feb 20 04:58:33 localhost dnsmasq-dhcp[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/host Feb 20 04:58:33 localhost dnsmasq-dhcp[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/opts Feb 20 04:58:33 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:33.626 264355 INFO neutron.agent.dhcp.agent [None req-a608acbe-e3c7-4374-9dec-33e2310b4308 - - - - - -] DHCP configuration for ports {'536b2372-3fbe-40de-9c4f-f7c748ef2941'} is completed#033[00m Feb 20 04:58:33 localhost nova_compute[281288]: 2026-02-20 09:58:33.979 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:58:34 localhost podman[319915]: 2026-02-20 09:58:34.147855619 +0000 UTC m=+0.087026033 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, version=9.7, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git) Feb 20 04:58:34 localhost podman[319915]: 2026-02-20 09:58:34.164024322 +0000 UTC m=+0.103194786 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, version=9.7, vcs-type=git, release=1770267347, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public) Feb 20 04:58:34 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:58:34 localhost systemd[1]: tmp-crun.NWw8Wy.mount: Deactivated successfully. 
Feb 20 04:58:34 localhost dnsmasq[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/addn_hosts - 0 addresses Feb 20 04:58:34 localhost dnsmasq-dhcp[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/host Feb 20 04:58:34 localhost dnsmasq-dhcp[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/opts Feb 20 04:58:34 localhost podman[319952]: 2026-02-20 09:58:34.701769786 +0000 UTC m=+0.071307843 container kill 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:34 localhost nova_compute[281288]: 2026-02-20 09:58:34.982 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:35 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:35.073 264355 INFO neutron.agent.dhcp.agent [None req-4359195f-f41d-48b2-88d8-b018c7a86e8d - - - - - -] DHCP configuration for ports {'536b2372-3fbe-40de-9c4f-f7c748ef2941', '75e94885-97d0-4be1-9ebe-cfa150020c4f'} is completed#033[00m Feb 20 04:58:35 localhost dnsmasq[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/addn_hosts - 0 addresses Feb 20 04:58:35 localhost dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/host Feb 20 04:58:35 localhost dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/opts Feb 20 04:58:35 localhost podman[319990]: 2026-02-20 09:58:35.395623927 +0000 UTC 
m=+0.038675320 container kill fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:58:35 localhost ovn_controller[156798]: 2026-02-20T09:58:35Z|00372|binding|INFO|Releasing lport b3eab012-7666-4655-9b51-e4c7e9621497 from this chassis (sb_readonly=0) Feb 20 04:58:35 localhost ovn_controller[156798]: 2026-02-20T09:58:35Z|00373|binding|INFO|Setting lport b3eab012-7666-4655-9b51-e4c7e9621497 down in Southbound Feb 20 04:58:35 localhost nova_compute[281288]: 2026-02-20 09:58:35.800 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:35 localhost kernel: device tapb3eab012-76 left promiscuous mode Feb 20 04:58:35 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:35.809 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-93dbe9b6-4551-4902-9476-0f2070facdb5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93dbe9b6-4551-4902-9476-0f2070facdb5', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0e47cd34a784cbb89cbe56eafed5650', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7f70c52-a5b6-4d06-85ef-b3fdc5aa9c4b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b3eab012-7666-4655-9b51-e4c7e9621497) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:35 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:35.811 162652 INFO neutron.agent.ovn.metadata.agent [-] Port b3eab012-7666-4655-9b51-e4c7e9621497 in datapath 93dbe9b6-4551-4902-9476-0f2070facdb5 unbound from our chassis#033[00m Feb 20 04:58:35 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:35.816 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 93dbe9b6-4551-4902-9476-0f2070facdb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:58:35 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:35.817 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[e11353cb-063d-479d-8b9a-caf738a3c9d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:35 localhost nova_compute[281288]: 2026-02-20 09:58:35.822 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:35 localhost nova_compute[281288]: 2026-02-20 09:58:35.823 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 
04:58:36 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:58:36 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:58:36 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:58:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:36.356 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 520cd37f-c614-4b43-98b3-c526073f80d7 with type ""#033[00m Feb 20 04:58:36 localhost ovn_controller[156798]: 2026-02-20T09:58:36Z|00374|binding|INFO|Removing iface tap75e94885-97 ovn-installed in OVS Feb 20 04:58:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:36.358 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], 
virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7da8ed-e6e8-4b16-9eb3-015b0c928d92, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=75e94885-97d0-4be1-9ebe-cfa150020c4f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:36 localhost ovn_controller[156798]: 2026-02-20T09:58:36Z|00375|binding|INFO|Removing lport 75e94885-97d0-4be1-9ebe-cfa150020c4f ovn-installed in OVS Feb 20 04:58:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:36.359 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 75e94885-97d0-4be1-9ebe-cfa150020c4f in datapath 01006bb5-6e96-485f-99d6-c3f27965c51b unbound from our chassis#033[00m Feb 20 04:58:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:36.362 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01006bb5-6e96-485f-99d6-c3f27965c51b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:58:36 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:36.399 162782 DEBUG oslo.privsep.daemon [-] 
privsep: reply[af81aeee-c425-42ac-9265-a08ece9342c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:36 localhost nova_compute[281288]: 2026-02-20 09:58:36.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:36 localhost nova_compute[281288]: 2026-02-20 09:58:36.402 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:36 localhost dnsmasq[319914]: exiting on receipt of SIGTERM Feb 20 04:58:36 localhost podman[320028]: 2026-02-20 09:58:36.475440956 +0000 UTC m=+0.055943505 container kill 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 04:58:36 localhost systemd[1]: libpod-6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646.scope: Deactivated successfully. 
Feb 20 04:58:36 localhost podman[320042]: 2026-02-20 09:58:36.540257701 +0000 UTC m=+0.052551672 container died 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:36 localhost systemd[1]: tmp-crun.MgrJx2.mount: Deactivated successfully. Feb 20 04:58:36 localhost podman[320042]: 2026-02-20 09:58:36.581931761 +0000 UTC m=+0.094225692 container cleanup 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 20 04:58:36 localhost systemd[1]: libpod-conmon-6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646.scope: Deactivated successfully. 
Feb 20 04:58:36 localhost podman[320044]: 2026-02-20 09:58:36.605966133 +0000 UTC m=+0.110020382 container remove 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:58:36 localhost nova_compute[281288]: 2026-02-20 09:58:36.617 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:36 localhost kernel: device tap75e94885-97 left promiscuous mode Feb 20 04:58:36 localhost nova_compute[281288]: 2026-02-20 09:58:36.636 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:36 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:36.654 264355 INFO neutron.agent.dhcp.agent [None req-e455f8f0-087d-4011-b3b1-264293719a26 - - - - - -] Synchronizing state#033[00m Feb 20 04:58:36 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:36.801 264355 INFO neutron.agent.dhcp.agent [None req-8d45ea50-451f-40e0-9662-1427715115cb - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 20 04:58:36 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:36.802 264355 INFO neutron.agent.dhcp.agent [-] Starting network 01006bb5-6e96-485f-99d6-c3f27965c51b dhcp configuration#033[00m Feb 20 04:58:37 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:37.324 264355 INFO neutron.agent.linux.ip_lib [None req-f6cba991-fb9f-4c51-a0ee-e863ba004ca3 - - - - - -] Device tap44cb708b-a5 
cannot be used as it has no MAC address#033[00m Feb 20 04:58:37 localhost nova_compute[281288]: 2026-02-20 09:58:37.351 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:37 localhost kernel: device tap44cb708b-a5 entered promiscuous mode Feb 20 04:58:37 localhost nova_compute[281288]: 2026-02-20 09:58:37.358 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:37 localhost NetworkManager[5988]: [1771581517.3590] manager: (tap44cb708b-a5): new Generic device (/org/freedesktop/NetworkManager/Devices/59) Feb 20 04:58:37 localhost ovn_controller[156798]: 2026-02-20T09:58:37Z|00376|binding|INFO|Claiming lport 44cb708b-a522-48fc-9798-157cbbbe1988 for this chassis. Feb 20 04:58:37 localhost ovn_controller[156798]: 2026-02-20T09:58:37Z|00377|binding|INFO|44cb708b-a522-48fc-9798-157cbbbe1988: Claiming unknown Feb 20 04:58:37 localhost systemd-udevd[320078]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:58:37 localhost ovn_controller[156798]: 2026-02-20T09:58:37Z|00378|binding|INFO|Setting lport 44cb708b-a522-48fc-9798-157cbbbe1988 ovn-installed in OVS Feb 20 04:58:37 localhost nova_compute[281288]: 2026-02-20 09:58:37.367 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:37 localhost ovn_controller[156798]: 2026-02-20T09:58:37Z|00379|binding|INFO|Setting lport 44cb708b-a522-48fc-9798-157cbbbe1988 up in Southbound Feb 20 04:58:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:37.371 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7da8ed-e6e8-4b16-9eb3-015b0c928d92, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=44cb708b-a522-48fc-9798-157cbbbe1988) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:37 localhost 
nova_compute[281288]: 2026-02-20 09:58:37.370 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:37.375 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 44cb708b-a522-48fc-9798-157cbbbe1988 in datapath 01006bb5-6e96-485f-99d6-c3f27965c51b bound to our chassis#033[00m Feb 20 04:58:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:37.379 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 004e0c9d-465d-4bd4-9c9e-d4787ee58d30 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:58:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:37.379 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01006bb5-6e96-485f-99d6-c3f27965c51b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:58:37 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:37.381 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[dbea7bad-ef4b-4d35-aba2-d0cfdc548581]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:37 localhost nova_compute[281288]: 2026-02-20 09:58:37.389 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:37 localhost nova_compute[281288]: 2026-02-20 09:58:37.427 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:37 localhost nova_compute[281288]: 2026-02-20 09:58:37.508 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:37 localhost systemd[1]: var-lib-containers-storage-overlay-f21ae561d6365eec38a5d27ea0386b57d32a2312c2b64cc194e30ff4e8fac29e-merged.mount: Deactivated successfully. Feb 20 04:58:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646-userdata-shm.mount: Deactivated successfully. Feb 20 04:58:38 localhost podman[320134]: Feb 20 04:58:38 localhost podman[320134]: 2026-02-20 09:58:38.359187441 +0000 UTC m=+0.093172660 container create e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:38 localhost systemd[1]: Started libpod-conmon-e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415.scope. Feb 20 04:58:38 localhost podman[320134]: 2026-02-20 09:58:38.314576872 +0000 UTC m=+0.048562111 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:58:38 localhost systemd[1]: Started libcrun container. 
Feb 20 04:58:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3c6ec8dbf39580881a394b21a90590bb4d78083db4873530e20c2b99ab090a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:58:38 localhost podman[320134]: 2026-02-20 09:58:38.45893414 +0000 UTC m=+0.192919359 container init e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127) Feb 20 04:58:38 localhost podman[320134]: 2026-02-20 09:58:38.468132441 +0000 UTC m=+0.202117660 container start e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 04:58:38 localhost dnsmasq[320151]: started, version 2.85 cachesize 150 Feb 20 04:58:38 localhost dnsmasq[320151]: DNS service limited to local subnets Feb 20 04:58:38 localhost dnsmasq[320151]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:58:38 localhost dnsmasq[320151]: warning: no upstream servers 
configured Feb 20 04:58:38 localhost dnsmasq-dhcp[320151]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:58:38 localhost dnsmasq[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/addn_hosts - 0 addresses Feb 20 04:58:38 localhost dnsmasq-dhcp[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/host Feb 20 04:58:38 localhost dnsmasq-dhcp[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/opts Feb 20 04:58:38 localhost systemd[1]: tmp-crun.QgJDG1.mount: Deactivated successfully. Feb 20 04:58:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:38.530 264355 INFO neutron.agent.dhcp.agent [None req-6cc633c4-1491-4b60-a6ee-0e2ab62fa379 - - - - - -] Finished network 01006bb5-6e96-485f-99d6-c3f27965c51b dhcp configuration#033[00m Feb 20 04:58:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:38.530 264355 INFO neutron.agent.dhcp.agent [None req-8d45ea50-451f-40e0-9662-1427715115cb - - - - - -] Synchronizing state complete#033[00m Feb 20 04:58:38 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:38.635 264355 INFO neutron.agent.dhcp.agent [None req-3c5cc925-3a4e-40ed-aa05-dc24ae9cfdb8 - - - - - -] DHCP configuration for ports {'536b2372-3fbe-40de-9c4f-f7c748ef2941'} is completed#033[00m Feb 20 04:58:38 localhost systemd[1]: tmp-crun.qE16Pu.mount: Deactivated successfully. 
Feb 20 04:58:38 localhost dnsmasq[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/addn_hosts - 0 addresses Feb 20 04:58:38 localhost dnsmasq-dhcp[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/host Feb 20 04:58:38 localhost dnsmasq-dhcp[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/opts Feb 20 04:58:38 localhost podman[320169]: 2026-02-20 09:58:38.753653059 +0000 UTC m=+0.075168480 container kill e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 04:58:39 localhost dnsmasq[320151]: exiting on receipt of SIGTERM Feb 20 04:58:39 localhost podman[320206]: 2026-02-20 09:58:39.198287606 +0000 UTC m=+0.063355890 container kill e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:58:39 localhost systemd[1]: libpod-e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415.scope: Deactivated successfully. 
Feb 20 04:58:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:58:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 20 04:58:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 20 04:58:39 localhost podman[320219]: 2026-02-20 09:58:39.263853025 +0000 UTC m=+0.050567602 container died e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 20 04:58:39 localhost podman[320219]: 2026-02-20 09:58:39.29686312 +0000 UTC m=+0.083577647 container cleanup e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 04:58:39 localhost systemd[1]: 
libpod-conmon-e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415.scope: Deactivated successfully. Feb 20 04:58:39 localhost podman[320220]: 2026-02-20 09:58:39.350085521 +0000 UTC m=+0.129489326 container remove e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:58:39 localhost nova_compute[281288]: 2026-02-20 09:58:39.410 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:39 localhost kernel: device tap44cb708b-a5 left promiscuous mode Feb 20 04:58:39 localhost ovn_controller[156798]: 2026-02-20T09:58:39Z|00380|binding|INFO|Releasing lport 44cb708b-a522-48fc-9798-157cbbbe1988 from this chassis (sb_readonly=0) Feb 20 04:58:39 localhost ovn_controller[156798]: 2026-02-20T09:58:39Z|00381|binding|INFO|Setting lport 44cb708b-a522-48fc-9798-157cbbbe1988 down in Southbound Feb 20 04:58:39 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:39.418 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 
'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7da8ed-e6e8-4b16-9eb3-015b0c928d92, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=44cb708b-a522-48fc-9798-157cbbbe1988) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:39 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:39.420 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 44cb708b-a522-48fc-9798-157cbbbe1988 in datapath 01006bb5-6e96-485f-99d6-c3f27965c51b unbound from our chassis#033[00m Feb 20 04:58:39 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:39.421 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 01006bb5-6e96-485f-99d6-c3f27965c51b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:58:39 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:39.423 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d2a1cb-eee2-4fb1-af72-6accf4f9e6ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:39 localhost nova_compute[281288]: 2026-02-20 09:58:39.434 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:39 localhost systemd[1]: var-lib-containers-storage-overlay-f3c6ec8dbf39580881a394b21a90590bb4d78083db4873530e20c2b99ab090a3-merged.mount: Deactivated successfully. Feb 20 04:58:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415-userdata-shm.mount: Deactivated successfully. Feb 20 04:58:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:39.646 264355 INFO neutron.agent.dhcp.agent [None req-be451e73-74db-4370-b7ab-24915ca4d317 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:58:39 localhost systemd[1]: run-netns-qdhcp\x2d01006bb5\x2d6e96\x2d485f\x2d99d6\x2dc3f27965c51b.mount: Deactivated successfully. Feb 20 04:58:39 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:39.649 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:58:39 localhost nova_compute[281288]: 2026-02-20 09:58:39.986 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:58:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:58:40 localhost ovn_controller[156798]: 2026-02-20T09:58:40Z|00382|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:58:40 localhost nova_compute[281288]: 2026-02-20 09:58:40.120 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:40 localhost podman[320249]: 2026-02-20 09:58:40.157047247 +0000 UTC m=+0.092761607 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS) Feb 20 04:58:40 localhost podman[320249]: 2026-02-20 09:58:40.169072994 +0000 UTC m=+0.104787324 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 04:58:40 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:58:40 localhost systemd[1]: tmp-crun.14o06S.mount: Deactivated successfully. Feb 20 04:58:40 localhost podman[320248]: 2026-02-20 09:58:40.230885618 +0000 UTC m=+0.168018631 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:58:40 localhost podman[320248]: 2026-02-20 09:58:40.275385884 +0000 UTC m=+0.212518907 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:40 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:58:41 localhost ovn_controller[156798]: 2026-02-20T09:58:41Z|00383|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:58:41 localhost nova_compute[281288]: 2026-02-20 09:58:41.264 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:41 localhost nova_compute[281288]: 2026-02-20 09:58:41.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:42 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:58:42 localhost nova_compute[281288]: 2026-02-20 09:58:42.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:42 localhost systemd[1]: tmp-crun.SxBFXX.mount: Deactivated successfully. 
Feb 20 04:58:42 localhost dnsmasq[319067]: exiting on receipt of SIGTERM Feb 20 04:58:42 localhost podman[320307]: 2026-02-20 09:58:42.461523639 +0000 UTC m=+0.105487575 container kill fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 20 04:58:42 localhost systemd[1]: libpod-fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb.scope: Deactivated successfully. Feb 20 04:58:42 localhost podman[320323]: 2026-02-20 09:58:42.550570973 +0000 UTC m=+0.063466445 container died fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:58:42 localhost systemd[1]: tmp-crun.XaDAgD.mount: Deactivated successfully. 
Feb 20 04:58:42 localhost podman[320323]: 2026-02-20 09:58:42.605981851 +0000 UTC m=+0.118877273 container remove fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:58:42 localhost systemd[1]: libpod-conmon-fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb.scope: Deactivated successfully. Feb 20 04:58:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:42.852 264355 INFO neutron.agent.dhcp.agent [None req-caf08bf9-d1c1-4d8c-aacf-3246e5ef4799 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:58:42 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:42.853 264355 INFO neutron.agent.dhcp.agent [None req-caf08bf9-d1c1-4d8c-aacf-3246e5ef4799 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:58:43 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:43.321 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:58:43 localhost systemd[1]: var-lib-containers-storage-overlay-6954e2904a7005976441dfd44549da912216319c18455669f4ef3b3e79ca01de-merged.mount: Deactivated successfully. Feb 20 04:58:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:58:43 localhost systemd[1]: run-netns-qdhcp\x2d93dbe9b6\x2d4551\x2d4902\x2d9476\x2d0f2070facdb5.mount: Deactivated successfully. Feb 20 04:58:44 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:58:44 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:58:44 localhost podman[320457]: 2026-02-20 09:58:44.16921146 +0000 UTC m=+0.115200891 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, maintainer=Guillaume Abrioux , distribution-scope=public, io.buildah.version=1.42.2, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7) Feb 20 04:58:44 localhost podman[320457]: 2026-02-20 09:58:44.309269137 +0000 UTC m=+0.255258608 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, version=7, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux ) Feb 20 04:58:44 
localhost nova_compute[281288]: 2026-02-20 09:58:44.989 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:58:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:58:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:58:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:58:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:58:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 
172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:58:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:58:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:47 localhost ceph-mon[301857]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M Feb 20 04:58:47 localhost ceph-mon[301857]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M Feb 20 04:58:47 localhost ceph-mon[301857]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:58:47 localhost ceph-mon[301857]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:58:47 localhost ceph-mon[301857]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M Feb 20 04:58:47 localhost ceph-mon[301857]: Unable to set osd_memory_target on 
np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 20 04:58:47 localhost nova_compute[281288]: 2026-02-20 09:58:47.643 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:47 localhost podman[241968]: time="2026-02-20T09:58:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:58:47 localhost podman[241968]: @ - - [20/Feb/2026:09:58:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:58:47 localhost podman[241968]: @ - - [20/Feb/2026:09:58:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18361 "" "Go-http-client/1.1" Feb 20 04:58:47 localhost ovn_controller[156798]: 2026-02-20T09:58:47Z|00384|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:58:47 localhost nova_compute[281288]: 2026-02-20 09:58:47.990 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:58:48 localhost podman[320666]: 2026-02-20 09:58:48.166886623 +0000 UTC m=+0.097267625 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, 
managed_by=edpm_ansible) Feb 20 04:58:48 localhost podman[320666]: 2026-02-20 09:58:48.177159175 +0000 UTC m=+0.107540147 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2) Feb 20 04:58:48 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:58:49 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:58:49 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 04:58:49 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:58:49 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:58:49 localhost nova_compute[281288]: 2026-02-20 09:58:49.992 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:51 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:51.883 264355 INFO neutron.agent.linux.ip_lib [None req-65f50249-de55-4a89-86ea-a1b7d8e9557e - - - - - -] Device 
taped429b7d-6e cannot be used as it has no MAC address#033[00m Feb 20 04:58:51 localhost nova_compute[281288]: 2026-02-20 09:58:51.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:51 localhost kernel: device taped429b7d-6e entered promiscuous mode Feb 20 04:58:51 localhost nova_compute[281288]: 2026-02-20 09:58:51.913 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:51 localhost NetworkManager[5988]: [1771581531.9140] manager: (taped429b7d-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/60) Feb 20 04:58:51 localhost ovn_controller[156798]: 2026-02-20T09:58:51Z|00385|binding|INFO|Claiming lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 for this chassis. Feb 20 04:58:51 localhost ovn_controller[156798]: 2026-02-20T09:58:51Z|00386|binding|INFO|ed429b7d-6ede-437c-a873-d5a788bbc1e3: Claiming unknown Feb 20 04:58:51 localhost systemd-udevd[320695]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:58:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:51.934 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-fc30869a-497f-4b61-b96d-28cefb439c42', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc30869a-497f-4b61-b96d-28cefb439c42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db4c2cde6adc4016a4bb7c41aa8e59c8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5dde04c-263c-492d-a9af-a31ca9074a96, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ed429b7d-6ede-437c-a873-d5a788bbc1e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:51.936 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ed429b7d-6ede-437c-a873-d5a788bbc1e3 in datapath fc30869a-497f-4b61-b96d-28cefb439c42 bound to our chassis#033[00m Feb 20 04:58:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:51.939 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 49b3a0fb-cff4-4472-ae6a-4c0aa102af48 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:58:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:51.939 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc30869a-497f-4b61-b96d-28cefb439c42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:58:51 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:51.942 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9853aed1-e05c-4bae-9806-1a41d57ae3e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:51 localhost ovn_controller[156798]: 2026-02-20T09:58:51Z|00387|binding|INFO|Setting lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 ovn-installed in OVS Feb 20 04:58:51 localhost ovn_controller[156798]: 2026-02-20T09:58:51Z|00388|binding|INFO|Setting lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 up in Southbound Feb 20 04:58:51 localhost nova_compute[281288]: 2026-02-20 09:58:51.955 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:51 localhost nova_compute[281288]: 2026-02-20 09:58:51.997 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:52 localhost nova_compute[281288]: 2026-02-20 09:58:52.031 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:52 localhost podman[320748]: Feb 20 04:58:52 localhost podman[320748]: 2026-02-20 09:58:52.920565018 +0000 UTC m=+0.093884652 container create fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:58:52 localhost sshd[320761]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:58:52 localhost systemd[1]: Started libpod-conmon-fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854.scope. Feb 20 04:58:52 localhost podman[320748]: 2026-02-20 09:58:52.876456263 +0000 UTC m=+0.049775867 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:58:52 localhost systemd[1]: Started libcrun container. Feb 20 04:58:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0c41e1a106862f8d161432c72fe9494234bced6572d34f77ad98b154cf87b9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:58:53 localhost podman[320748]: 2026-02-20 09:58:53.002220185 +0000 UTC m=+0.175539789 container init fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:58:53 localhost podman[320748]: 2026-02-20 09:58:53.013290143 +0000 UTC m=+0.186609747 container start fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127) Feb 20 04:58:53 localhost dnsmasq[320767]: started, version 2.85 cachesize 150 Feb 20 04:58:53 localhost dnsmasq[320767]: DNS service limited to local subnets Feb 20 04:58:53 localhost dnsmasq[320767]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:58:53 localhost dnsmasq[320767]: warning: no upstream servers configured Feb 20 04:58:53 localhost dnsmasq-dhcp[320767]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:58:53 localhost dnsmasq[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/addn_hosts - 0 addresses Feb 20 04:58:53 localhost dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/host Feb 20 04:58:53 localhost dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/opts Feb 20 04:58:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 04:58:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 20 04:58:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 20 04:58:53 
localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:53.584 264355 INFO neutron.agent.dhcp.agent [None req-8fc9c3fc-284b-447f-a2b0-22674525fb40 - - - - - -] DHCP configuration for ports {'d398f4e9-922d-4f2f-be79-41ec31c42aab'} is completed#033[00m Feb 20 04:58:55 localhost nova_compute[281288]: 2026-02-20 09:58:55.037 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:55 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 20 04:58:55 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/438591704' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 20 04:58:56 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:56.156 264355 INFO neutron.agent.linux.ip_lib [None req-6fbd4e58-3077-4d00-af5b-1e6e04439f73 - - - - - -] Device tapca71dfc6-5b cannot be used as it has no MAC address#033[00m Feb 20 04:58:56 localhost nova_compute[281288]: 2026-02-20 09:58:56.238 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:56 localhost kernel: device tapca71dfc6-5b entered promiscuous mode Feb 20 04:58:56 localhost NetworkManager[5988]: [1771581536.2452] manager: (tapca71dfc6-5b): new Generic device (/org/freedesktop/NetworkManager/Devices/61) Feb 20 04:58:56 localhost ovn_controller[156798]: 2026-02-20T09:58:56Z|00389|binding|INFO|Claiming lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 for this chassis. 
Feb 20 04:58:56 localhost ovn_controller[156798]: 2026-02-20T09:58:56Z|00390|binding|INFO|ca71dfc6-5b76-44e7-a509-dbdd64a83fd3: Claiming unknown Feb 20 04:58:56 localhost nova_compute[281288]: 2026-02-20 09:58:56.246 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:56 localhost systemd-udevd[320779]: Network interface NamePolicy= disabled on kernel command line. Feb 20 04:58:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:56.265 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-31019f31-c68c-481a-9b72-3317c35499b9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31019f31-c68c-481a-9b72-3317c35499b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db4c2cde6adc4016a4bb7c41aa8e59c8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02320057-bc72-4d8a-838a-f4d7286b15bc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ca71dfc6-5b76-44e7-a509-dbdd64a83fd3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:56.267 162652 INFO 
neutron.agent.ovn.metadata.agent [-] Port ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 in datapath 31019f31-c68c-481a-9b72-3317c35499b9 bound to our chassis#033[00m Feb 20 04:58:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:56.273 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 892814dd-0933-4ec3-939e-24147c680731 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 04:58:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:56.273 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31019f31-c68c-481a-9b72-3317c35499b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:58:56 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:56.275 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[999f7548-3b27-42f1-9704-1af328891dc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:58:56 localhost journal[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device Feb 20 04:58:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 04:58:56 localhost nova_compute[281288]: 2026-02-20 09:58:56.285 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:56 localhost ovn_controller[156798]: 2026-02-20T09:58:56Z|00391|binding|INFO|Setting lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 ovn-installed in OVS Feb 20 04:58:56 localhost ovn_controller[156798]: 2026-02-20T09:58:56Z|00392|binding|INFO|Setting lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 up in Southbound Feb 20 04:58:56 localhost journal[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device Feb 20 04:58:56 localhost nova_compute[281288]: 2026-02-20 09:58:56.290 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:56 localhost journal[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device Feb 20 04:58:56 localhost journal[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device Feb 20 04:58:56 localhost journal[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device Feb 20 04:58:56 localhost journal[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device Feb 20 04:58:56 localhost journal[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device Feb 20 04:58:56 localhost journal[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device Feb 20 04:58:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:58:56 localhost nova_compute[281288]: 2026-02-20 09:58:56.331 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", 
"format": "json"} : dispatch Feb 20 04:58:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:58:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:58:56 localhost nova_compute[281288]: 2026-02-20 09:58:56.365 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:58:56 localhost podman[320786]: 2026-02-20 09:58:56.399819223 +0000 UTC m=+0.102211796 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 
'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:58:56 localhost podman[320786]: 2026-02-20 09:58:56.408993792 +0000 UTC m=+0.111386355 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:58:56 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 04:58:56 localhost openstack_network_exporter[244414]: ERROR 09:58:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:58:56 localhost openstack_network_exporter[244414]: Feb 20 04:58:56 localhost openstack_network_exporter[244414]: ERROR 09:58:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:58:56 localhost openstack_network_exporter[244414]: Feb 20 04:58:57 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e199 e199: 6 total, 6 up, 6 in Feb 20 04:58:57 localhost podman[320872]: Feb 20 04:58:57 localhost podman[320872]: 2026-02-20 09:58:57.395434547 +0000 UTC m=+0.123928097 container create 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 04:58:57 localhost systemd[1]: Started libpod-conmon-8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674.scope. Feb 20 04:58:57 localhost podman[320872]: 2026-02-20 09:58:57.34599048 +0000 UTC m=+0.074484070 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 04:58:57 localhost systemd[1]: tmp-crun.qzkYdY.mount: Deactivated successfully. Feb 20 04:58:57 localhost systemd[1]: Started libcrun container. 
Feb 20 04:58:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/095be6d19a7c831778435ebe06a1dd1952a6d636d2c8437ba888c4b45c7d81ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 04:58:57 localhost podman[320872]: 2026-02-20 09:58:57.490294907 +0000 UTC m=+0.218788457 container init 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Feb 20 04:58:57 localhost podman[320872]: 2026-02-20 09:58:57.500147467 +0000 UTC m=+0.228641017 container start 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:58:57 localhost dnsmasq[320889]: started, version 2.85 cachesize 150 Feb 20 04:58:57 localhost dnsmasq[320889]: DNS service limited to local subnets Feb 20 04:58:57 localhost dnsmasq[320889]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 04:58:57 localhost dnsmasq[320889]: warning: no upstream servers 
configured Feb 20 04:58:57 localhost dnsmasq-dhcp[320889]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 04:58:57 localhost dnsmasq[320889]: read /var/lib/neutron/dhcp/31019f31-c68c-481a-9b72-3317c35499b9/addn_hosts - 0 addresses Feb 20 04:58:57 localhost dnsmasq-dhcp[320889]: read /var/lib/neutron/dhcp/31019f31-c68c-481a-9b72-3317c35499b9/host Feb 20 04:58:57 localhost dnsmasq-dhcp[320889]: read /var/lib/neutron/dhcp/31019f31-c68c-481a-9b72-3317c35499b9/opts Feb 20 04:58:57 localhost sshd[320890]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:58:57 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:58:57.688 264355 INFO neutron.agent.dhcp.agent [None req-e4fbba05-c9f4-4cb2-88d4-e6adeed73622 - - - - - -] DHCP configuration for ports {'d3b0281c-a10c-40f6-ae69-d8e1ee2004d8'} is completed#033[00m Feb 20 04:58:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e200 e200: 6 total, 6 up, 6 in Feb 20 04:58:58 localhost sshd[320892]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:58:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:58:58 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1250733315' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:58:58 localhost nova_compute[281288]: 2026-02-20 09:58:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:58 localhost nova_compute[281288]: 2026-02-20 09:58:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:58:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:59.052 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:58:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:58:59.054 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:58:59 localhost nova_compute[281288]: 2026-02-20 09:58:59.087 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:00 localhost nova_compute[281288]: 2026-02-20 09:59:00.041 
281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:00 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 04:59:00 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 20 04:59:00 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0. Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.435339) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34 Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540435377, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2816, "num_deletes": 266, "total_data_size": 4226566, "memory_usage": 4292672, "flush_reason": "Manual Compaction"} Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540448904, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2755961, "file_checksum": "", 
"file_checksum_func_name": "Unknown", "smallest_seqno": 22783, "largest_seqno": 25594, "table_properties": {"data_size": 2744524, "index_size": 7302, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27174, "raw_average_key_size": 22, "raw_value_size": 2720547, "raw_average_value_size": 2257, "num_data_blocks": 307, "num_entries": 1205, "num_filter_entries": 1205, "num_deletions": 266, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581414, "oldest_key_time": 1771581414, "file_creation_time": 1771581540, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 13876 microseconds, and 7157 cpu microseconds. Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.449211) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2755961 bytes OK Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.449333) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.451589) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.451614) EVENT_LOG_v1 {"time_micros": 1771581540451608, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.451671) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 4213392, prev total WAL file size 4213392, number of live WAL files 2. Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.453403) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. 
'7061786F73003132333030' seq:0, type:0; will stop at (end) Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2691KB)], [33(17MB)] Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540453460, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 20730407, "oldest_snapshot_seqno": -1} Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 13383 keys, 19529775 bytes, temperature: kUnknown Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540542072, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 19529775, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19449742, "index_size": 45500, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33477, "raw_key_size": 356165, "raw_average_key_size": 26, "raw_value_size": 19218584, "raw_average_value_size": 1436, "num_data_blocks": 1741, "num_entries": 13383, "num_filter_entries": 13383, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581540, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.542434) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 19529775 bytes Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.544122) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 233.7 rd, 220.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 17.1 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(14.6) write-amplify(7.1) OK, records in: 13936, records dropped: 553 output_compression: NoCompression Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.544152) EVENT_LOG_v1 {"time_micros": 1771581540544139, "job": 18, "event": "compaction_finished", "compaction_time_micros": 88723, "compaction_time_cpu_micros": 54434, "output_level": 6, "num_output_files": 1, "total_output_size": 19529775, "num_input_records": 13936, "num_output_records": 13383, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005625204/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540544677, "job": 18, "event": "table_file_deletion", "file_number": 35} Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540547753, "job": 18, "event": "table_file_deletion", "file_number": 33} Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.453283) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547795) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:00 localhost nova_compute[281288]: 2026-02-20 09:59:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task 
ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 20 04:59:01 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/691707941' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 20 04:59:01 localhost nova_compute[281288]: 2026-02-20 09:59:01.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:01 localhost nova_compute[281288]: 2026-02-20 09:59:01.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:59:01 localhost nova_compute[281288]: 2026-02-20 09:59:01.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:59:01 localhost nova_compute[281288]: 2026-02-20 09:59:01.744 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:59:01 localhost nova_compute[281288]: 2026-02-20 09:59:01.744 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 04:59:01 localhost nova_compute[281288]: 2026-02-20 09:59:01.744 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:59:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 04:59:02 localhost podman[320914]: 2026-02-20 09:59:02.135705434 +0000 UTC m=+0.072334085 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:59:02 localhost podman[320914]: 2026-02-20 09:59:02.148492585 +0000 UTC m=+0.085121216 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 04:59:02 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:02.151 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:59:01Z, description=, device_id=95ba958f-a3ec-4f5b-8859-f347b6468462, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=55476999-5229-4851-a7ff-9e5f33ce284f, ip_allocation=immediate, mac_address=fa:16:3e:16:8a:b8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:58:49Z, description=, dns_domain=, id=fc30869a-497f-4b61-b96d-28cefb439c42, ipv4_address_scope=None, 
ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-129517273, port_security_enabled=True, project_id=db4c2cde6adc4016a4bb7c41aa8e59c8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16743, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3039, status=ACTIVE, subnets=['5feacf59-4769-4952-9d4b-54ca0cbd92a2'], tags=[], tenant_id=db4c2cde6adc4016a4bb7c41aa8e59c8, updated_at=2026-02-20T09:58:50Z, vlan_transparent=None, network_id=fc30869a-497f-4b61-b96d-28cefb439c42, port_security_enabled=False, project_id=db4c2cde6adc4016a4bb7c41aa8e59c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3078, status=DOWN, tags=[], tenant_id=db4c2cde6adc4016a4bb7c41aa8e59c8, updated_at=2026-02-20T09:59:01Z on network fc30869a-497f-4b61-b96d-28cefb439c42#033[00m Feb 20 04:59:02 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 04:59:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:59:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/909841881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.183 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.275 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.275 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 04:59:02 localhost dnsmasq[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/addn_hosts - 1 addresses Feb 20 04:59:02 localhost dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/host Feb 20 04:59:02 localhost dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/opts Feb 20 04:59:02 localhost systemd[1]: tmp-crun.y2vRKr.mount: Deactivated successfully. 
Feb 20 04:59:02 localhost podman[320955]: 2026-02-20 09:59:02.429590509 +0000 UTC m=+0.085867718 container kill fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.524 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.526 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11268MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.527 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.527 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.773 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.773 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.774 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 04:59:02 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:02.861 264355 INFO neutron.agent.dhcp.agent [None req-faff82a9-e1fb-49d1-99ad-0350d66adec5 - - - - - -] DHCP configuration for ports {'55476999-5229-4851-a7ff-9e5f33ce284f'} is completed#033[00m Feb 20 04:59:02 localhost nova_compute[281288]: 2026-02-20 09:59:02.871 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: 
[db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0. Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.003723) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37 Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543003773, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 297, "num_deletes": 256, "total_data_size": 82012, "memory_usage": 89232, "flush_reason": "Manual Compaction"} Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543006540, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 53441, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25599, "largest_seqno": 25891, "table_properties": {"data_size": 51528, "index_size": 152, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4865, "raw_average_key_size": 17, "raw_value_size": 47694, "raw_average_value_size": 172, "num_data_blocks": 7, "num_entries": 277, "num_filter_entries": 277, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", 
"compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581540, "oldest_key_time": 1771581540, "file_creation_time": 1771581543, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}} Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 2840 microseconds, and 743 cpu microseconds. Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.006570) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 53441 bytes OK Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.006586) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.008299) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.008317) EVENT_LOG_v1 {"time_micros": 1771581543008313, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.008339) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: 
[db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 79797, prev total WAL file size 79797, number of live WAL files 2. Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.012725) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303231' seq:72057594037927935, type:22 .. '6C6F676D0034323733' seq:0, type:0; will stop at (end) Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(52KB)], [36(18MB)] Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543012828, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 19583216, "oldest_snapshot_seqno": -1} Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 13141 keys, 18931300 bytes, temperature: kUnknown Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543087756, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 18931300, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18854102, "index_size": 43241, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, 
"filter_size": 32901, "raw_key_size": 352094, "raw_average_key_size": 26, "raw_value_size": 18628400, "raw_average_value_size": 1417, "num_data_blocks": 1636, "num_entries": 13141, "num_filter_entries": 13141, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581543, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}} Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.088204) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 18931300 bytes Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.090335) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 260.9 rd, 252.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 18.6 +0.0 blob) out(18.1 +0.0 blob), read-write-amplify(720.7) write-amplify(354.2) OK, records in: 13660, records dropped: 519 output_compression: NoCompression Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.090365) EVENT_LOG_v1 {"time_micros": 1771581543090352, "job": 20, "event": "compaction_finished", "compaction_time_micros": 75065, "compaction_time_cpu_micros": 41969, "output_level": 6, "num_output_files": 1, "total_output_size": 18931300, "num_input_records": 13660, "num_output_records": 13141, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543090526, "job": 20, "event": "table_file_deletion", "file_number": 38} Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543093206, 
"job": 20, "event": "table_file_deletion", "file_number": 36} Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.012553) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093288) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093289) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 04:59:03 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 04:59:03 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/4235731053' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 04:59:03 localhost nova_compute[281288]: 2026-02-20 09:59:03.362 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 04:59:03 localhost nova_compute[281288]: 2026-02-20 09:59:03.370 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 04:59:03 localhost nova_compute[281288]: 2026-02-20 09:59:03.384 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 04:59:03 localhost nova_compute[281288]: 2026-02-20 09:59:03.387 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 04:59:03 localhost nova_compute[281288]: 2026-02-20 09:59:03.387 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:59:03 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:03.663 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:59:01Z, description=, device_id=95ba958f-a3ec-4f5b-8859-f347b6468462, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=55476999-5229-4851-a7ff-9e5f33ce284f, ip_allocation=immediate, mac_address=fa:16:3e:16:8a:b8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:58:49Z, description=, dns_domain=, id=fc30869a-497f-4b61-b96d-28cefb439c42, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-129517273, port_security_enabled=True, project_id=db4c2cde6adc4016a4bb7c41aa8e59c8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16743, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3039, status=ACTIVE, subnets=['5feacf59-4769-4952-9d4b-54ca0cbd92a2'], tags=[], tenant_id=db4c2cde6adc4016a4bb7c41aa8e59c8, updated_at=2026-02-20T09:58:50Z, vlan_transparent=None, network_id=fc30869a-497f-4b61-b96d-28cefb439c42, port_security_enabled=False, project_id=db4c2cde6adc4016a4bb7c41aa8e59c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3078, status=DOWN, tags=[], 
tenant_id=db4c2cde6adc4016a4bb7c41aa8e59c8, updated_at=2026-02-20T09:59:01Z on network fc30869a-497f-4b61-b96d-28cefb439c42#033[00m Feb 20 04:59:03 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:59:03 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:03 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:03 localhost systemd[1]: tmp-crun.e2PzWa.mount: Deactivated successfully. 
Feb 20 04:59:03 localhost podman[321015]: 2026-02-20 09:59:03.923169545 +0000 UTC m=+0.082715640 container kill fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:59:03 localhost dnsmasq[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/addn_hosts - 1 addresses Feb 20 04:59:03 localhost dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/host Feb 20 04:59:03 localhost dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/opts Feb 20 04:59:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:04.055 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 04:59:04 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:04.264 264355 INFO neutron.agent.dhcp.agent [None req-7b843224-f6b6-45c3-8534-8c2e6b37671c - - - - - -] DHCP configuration for ports {'55476999-5229-4851-a7ff-9e5f33ce284f'} is completed#033[00m Feb 20 04:59:04 localhost nova_compute[281288]: 2026-02-20 09:59:04.388 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:04 localhost nova_compute[281288]: 2026-02-20 09:59:04.389 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:04 localhost nova_compute[281288]: 2026-02-20 09:59:04.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:04 localhost nova_compute[281288]: 2026-02-20 09:59:04.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 04:59:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e201 e201: 6 total, 6 up, 6 in Feb 20 04:59:04 localhost podman[321053]: 2026-02-20 09:59:04.792360967 +0000 UTC m=+0.076701537 container kill fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:59:04 localhost dnsmasq[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/addn_hosts - 0 addresses Feb 20 04:59:04 localhost dnsmasq-dhcp[320767]: read 
/var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/host Feb 20 04:59:04 localhost dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/opts Feb 20 04:59:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 04:59:04 localhost podman[321066]: 2026-02-20 09:59:04.915686635 +0000 UTC m=+0.090831998 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1770267347, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 20 04:59:04 localhost podman[321066]: 2026-02-20 09:59:04.933964902 +0000 UTC m=+0.109110315 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1770267347, config_id=openstack_network_exporter) Feb 20 04:59:04 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:59:04 localhost nova_compute[281288]: 2026-02-20 09:59:04.973 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:04 localhost ovn_controller[156798]: 2026-02-20T09:59:04Z|00393|binding|INFO|Releasing lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 from this chassis (sb_readonly=0) Feb 20 04:59:04 localhost ovn_controller[156798]: 2026-02-20T09:59:04Z|00394|binding|INFO|Setting lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 down in Southbound Feb 20 04:59:04 localhost kernel: device taped429b7d-6e left promiscuous mode Feb 20 04:59:04 localhost nova_compute[281288]: 2026-02-20 09:59:04.984 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:04.984 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 
'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-fc30869a-497f-4b61-b96d-28cefb439c42', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc30869a-497f-4b61-b96d-28cefb439c42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db4c2cde6adc4016a4bb7c41aa8e59c8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5dde04c-263c-492d-a9af-a31ca9074a96, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ed429b7d-6ede-437c-a873-d5a788bbc1e3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:59:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:04.987 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ed429b7d-6ede-437c-a873-d5a788bbc1e3 in datapath fc30869a-497f-4b61-b96d-28cefb439c42 unbound from our chassis#033[00m Feb 20 04:59:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:04.990 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc30869a-497f-4b61-b96d-28cefb439c42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:59:04 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:04.991 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[0d63e503-abc3-4237-9c13-376afcfe3e80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:59:04 localhost nova_compute[281288]: 2026-02-20 09:59:04.999 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:05 localhost nova_compute[281288]: 2026-02-20 09:59:05.042 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:05 localhost nova_compute[281288]: 2026-02-20 09:59:05.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:06.021 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 04:59:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:06.022 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 04:59:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 04:59:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:06 localhost dnsmasq[320889]: exiting on receipt of SIGTERM Feb 20 04:59:06 localhost podman[321111]: 2026-02-20 09:59:06.450889019 +0000 UTC m=+0.058594746 container kill 
8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:59:06 localhost systemd[1]: libpod-8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674.scope: Deactivated successfully. Feb 20 04:59:06 localhost podman[321126]: 2026-02-20 09:59:06.52472164 +0000 UTC m=+0.056121351 container died 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:59:06 localhost systemd[1]: tmp-crun.aRUac6.mount: Deactivated successfully. Feb 20 04:59:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674-userdata-shm.mount: Deactivated successfully. 
Feb 20 04:59:06 localhost podman[321126]: 2026-02-20 09:59:06.558987433 +0000 UTC m=+0.090387064 container cleanup 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:59:06 localhost systemd[1]: libpod-conmon-8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674.scope: Deactivated successfully. Feb 20 04:59:06 localhost podman[321127]: 2026-02-20 09:59:06.596794605 +0000 UTC m=+0.124514104 container remove 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.609 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:06 localhost kernel: device tapca71dfc6-5b left promiscuous mode Feb 20 04:59:06 localhost ovn_controller[156798]: 2026-02-20T09:59:06Z|00395|binding|INFO|Releasing lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 from this chassis (sb_readonly=0) Feb 20 04:59:06 localhost ovn_controller[156798]: 
2026-02-20T09:59:06Z|00396|binding|INFO|Setting lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 down in Southbound Feb 20 04:59:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:06.622 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-31019f31-c68c-481a-9b72-3317c35499b9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31019f31-c68c-481a-9b72-3317c35499b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db4c2cde6adc4016a4bb7c41aa8e59c8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02320057-bc72-4d8a-838a-f4d7286b15bc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ca71dfc6-5b76-44e7-a509-dbdd64a83fd3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:59:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:06.624 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 in datapath 31019f31-c68c-481a-9b72-3317c35499b9 unbound from our chassis#033[00m Feb 20 04:59:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:06.627 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports 
were found for network 31019f31-c68c-481a-9b72-3317c35499b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 04:59:06 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:06.628 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[90776264-ab6d-413a-9032-3575d7906376]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.633 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.801 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.802 281292 
DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.802 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 04:59:06 localhost nova_compute[281288]: 2026-02-20 09:59:06.803 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 04:59:06 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:06.874 264355 INFO neutron.agent.dhcp.agent [None req-17d2ed3d-d760-42a6-8ec9-5ce820d2cafd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:59:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:07.000 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:59:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:07.235 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:59:07 localhost nova_compute[281288]: 2026-02-20 09:59:07.401 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", 
"subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 04:59:07 localhost nova_compute[281288]: 2026-02-20 09:59:07.416 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 04:59:07 localhost nova_compute[281288]: 2026-02-20 09:59:07.417 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 04:59:07 localhost ovn_controller[156798]: 2026-02-20T09:59:07Z|00397|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:59:07 localhost systemd[1]: 
var-lib-containers-storage-overlay-095be6d19a7c831778435ebe06a1dd1952a6d636d2c8437ba888c4b45c7d81ce-merged.mount: Deactivated successfully. Feb 20 04:59:07 localhost systemd[1]: run-netns-qdhcp\x2d31019f31\x2dc68c\x2d481a\x2d9b72\x2d3317c35499b9.mount: Deactivated successfully. Feb 20 04:59:07 localhost nova_compute[281288]: 2026-02-20 09:59:07.487 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:08 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:59:08 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 20 04:59:08 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 20 04:59:08 localhost dnsmasq[320767]: exiting on receipt of SIGTERM Feb 20 04:59:08 localhost podman[321172]: 2026-02-20 09:59:08.335696676 +0000 UTC m=+0.063946769 container kill fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 20 04:59:08 localhost systemd[1]: libpod-fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854.scope: Deactivated successfully. 
Feb 20 04:59:08 localhost podman[321184]: 2026-02-20 09:59:08.397954963 +0000 UTC m=+0.046681643 container died fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 04:59:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854-userdata-shm.mount: Deactivated successfully. Feb 20 04:59:08 localhost systemd[1]: var-lib-containers-storage-overlay-e0c41e1a106862f8d161432c72fe9494234bced6572d34f77ad98b154cf87b9a-merged.mount: Deactivated successfully. Feb 20 04:59:08 localhost podman[321184]: 2026-02-20 09:59:08.483776638 +0000 UTC m=+0.132503318 container cleanup fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:59:08 localhost systemd[1]: libpod-conmon-fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854.scope: Deactivated successfully. 
Feb 20 04:59:08 localhost podman[321186]: 2026-02-20 09:59:08.505461009 +0000 UTC m=+0.146996080 container remove fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:59:08 localhost systemd[1]: run-netns-qdhcp\x2dfc30869a\x2d497f\x2d4b61\x2db96d\x2d28cefb439c42.mount: Deactivated successfully. Feb 20 04:59:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:08.728 264355 INFO neutron.agent.dhcp.agent [None req-8f1db16b-8bb3-49dd-a911-4681008d3273 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:59:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:08.729 264355 INFO neutron.agent.dhcp.agent [None req-8f1db16b-8bb3-49dd-a911-4681008d3273 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 04:59:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e202 e202: 6 total, 6 up, 6 in Feb 20 04:59:10 localhost nova_compute[281288]: 2026-02-20 09:59:10.047 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:10 localhost nova_compute[281288]: 2026-02-20 09:59:10.050 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:10 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", 
"format": "json"} : dispatch Feb 20 04:59:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e203 e203: 6 total, 6 up, 6 in Feb 20 04:59:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:59:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 04:59:11 localhost podman[321216]: 2026-02-20 09:59:11.125833976 +0000 UTC m=+0.068268481 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 20 
04:59:11 localhost podman[321217]: 2026-02-20 09:59:11.192742314 +0000 UTC m=+0.125035992 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Feb 20 04:59:11 localhost podman[321216]: 
2026-02-20 09:59:11.223471801 +0000 UTC m=+0.165906286 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:59:11 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:59:11 localhost podman[321217]: 2026-02-20 09:59:11.27824089 +0000 UTC m=+0.210534587 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Feb 20 04:59:11 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 04:59:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:12 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:12 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:13 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:13.236 2 INFO neutron.agent.securitygroups_rpc [None req-02b2e18c-e6bb-49a4-a8f1-2084c77a3d21 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['aead394c-a7d3-40bc-acee-c30aa527c351']#033[00m Feb 20 04:59:13 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:13.237 2 INFO neutron.agent.securitygroups_rpc [None req-056c38b9-a9d3-4d30-8e17-1a44ab4fc9c9 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['aead394c-a7d3-40bc-acee-c30aa527c351']#033[00m Feb 20 04:59:13 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e204 e204: 6 total, 6 up, 6 in Feb 20 04:59:13 localhost 
neutron_sriov_agent[257177]: 2026-02-20 09:59:13.834 2 INFO neutron.agent.securitygroups_rpc [None req-5619c222-5cd3-438e-b875-1e00ee8b5a9d b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:13 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:13.835 2 INFO neutron.agent.securitygroups_rpc [None req-76a11935-2f93-444b-98b0-ed592d92678c b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:14.332 2 INFO neutron.agent.securitygroups_rpc [None req-92dae041-433a-447b-819c-ca016de78f58 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:14.395 2 INFO neutron.agent.securitygroups_rpc [None req-2cd7aa12-02dc-45d2-aacd-015fd7ca5faf b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:14.897 2 INFO neutron.agent.securitygroups_rpc [None req-8704f1e9-0786-45ef-9124-4ff6c69c9edf b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:14.898 2 INFO neutron.agent.securitygroups_rpc [None req-82bfdf30-c718-45ba-8302-76a78964efac b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:14 localhost 
neutron_sriov_agent[257177]: 2026-02-20 09:59:14.898 2 INFO neutron.agent.securitygroups_rpc [None req-6f3202e0-dd36-49f7-90de-4aa05c7d3120 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:14 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:14.899 2 INFO neutron.agent.securitygroups_rpc [None req-348b4a4e-d300-45af-acd6-5c08e553ddf3 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:15 localhost nova_compute[281288]: 2026-02-20 09:59:15.051 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:15 localhost nova_compute[281288]: 2026-02-20 09:59:15.053 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:15 localhost nova_compute[281288]: 2026-02-20 09:59:15.053 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:59:15 localhost nova_compute[281288]: 2026-02-20 09:59:15.053 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:15 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:15.468 2 INFO neutron.agent.securitygroups_rpc [None req-9fbdc85e-428a-4317-ba9b-cd92888d9cb2 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:15 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:15.470 2 INFO neutron.agent.securitygroups_rpc [None 
req-db51d6cf-e253-422e-b04f-9b8629f57782 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']#033[00m Feb 20 04:59:15 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:15.980 2 INFO neutron.agent.securitygroups_rpc [None req-af481eae-6194-40c9-88e4-bb7253323390 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['16efbbcf-ddc6-4434-9318-5d841ffddaef']#033[00m Feb 20 04:59:16 localhost nova_compute[281288]: 2026-02-20 09:59:16.010 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:16 localhost nova_compute[281288]: 2026-02-20 09:59:16.011 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:16 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:16.603 2 INFO neutron.agent.securitygroups_rpc [None req-8397ac5f-4c0e-48b4-864b-bbce3e3a32e8 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['868259ee-6cd3-44fa-b964-b511ba69ce8b']#033[00m Feb 20 04:59:16 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:16.738 2 INFO neutron.agent.securitygroups_rpc [None req-248bf9b5-6ff0-42de-8583-69a922702068 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['868259ee-6cd3-44fa-b964-b511ba69ce8b']#033[00m Feb 20 04:59:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e205 e205: 6 total, 6 up, 6 in Feb 20 04:59:17 localhost ceph-mon[301857]: from='mgr.44375 
172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:59:17 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 20 04:59:17 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 20 04:59:17 localhost podman[241968]: time="2026-02-20T09:59:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:59:17 localhost podman[241968]: @ - - [20/Feb/2026:09:59:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:59:17 localhost podman[241968]: @ - - [20/Feb/2026:09:59:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18354 "" "Go-http-client/1.1" Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.319 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b50408a6-f1e1-4511-bb79-1516f3a6cbff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.320775', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd2401bde-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': 'ef80806b076098a93f2442ea2acb311d7a0791674fc8751ac507e6e3be259c78'}]}, 'timestamp': '2026-02-20 09:59:18.324528', '_unique_id': 'f57be228435647c58202a499e4ba1368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 
04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 04:59:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e206 e206: 6 total, 6 up, 6 in Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.350 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.350 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e1755dc8-4733-450d-b8f5-6ba5b32d7368', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.326730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2441752-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'c34e8599db8aca9926a70877f22460a183c07535890d7bc63a6628c3a7df7851'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.326730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24425e4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '8e648ee3b9b6abf139b947304aedca2ce3b8d203012a46d60cf086e9005d5b85'}]}, 'timestamp': '2026-02-20 09:59:18.350939', '_unique_id': '6703c9e1cf6140e49a30412aa758f2ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.352 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.352 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ae143fbb-cc78-456b-8b51-3907e9285d93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.352852', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd2447c38-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '9756a2c4a54c2ddd7d7b1f6fe250e635fe9ec5c6d925140848385273e1dca8bc'}]}, 'timestamp': '2026-02-20 09:59:18.353153', '_unique_id': 'd0c339835a654edfac92b6110469754a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.354 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.365 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '084730ce-2e99-4dc5-872c-85fb67319720', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.354520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2464c5c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': '38633e92014546172936328eb874ca24bc206a189264a14af0d4d427381761fa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.354520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24658f0-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': 'f55295629faf5a1a816f89324770016eb8f4fc142c32f4e16ef6ff40fe3d2273'}]}, 'timestamp': '2026-02-20 09:59:18.367992', '_unique_id': '0c7337abfdef46c79cb953f907b224b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.370 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.385 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 18870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1aaa8438-6358-4da2-b711-92cdd50c9d0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18870000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:59:18.371054', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd24996a0-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.624978983, 'message_signature': '83153108c7c9477705511cf6e1bf329322df1e9c4b7e888e44e24a94e9c2bf79'}]}, 'timestamp': '2026-02-20 09:59:18.386842', '_unique_id': '8372dad0397949a988dea59b3accff43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.390 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.390 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.390 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1571ecd-f191-4570-ab8e-5907ab470046', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.390380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24a3b3c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': 'edd3c502dea069fc0d33a567ccd65b194c65a371adc099ecb6204522d5210f68'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.390380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24a4f46-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': '0136080f8c88e9d093f609b35d7f2968d94c35beee13c019e76639f80f52d17a'}]}, 'timestamp': '2026-02-20 09:59:18.391383', '_unique_id': '515d2316d2e245d7882ffcc34c8aa3aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20
09:59:18.392 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.393 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.393 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '62dcd81c-0e3e-417f-8673-3784886d04e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.393784', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24abeb8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': 'f793bf70744fc94a00a589bf7c0167ee36dc848560f7f4e748ec4328fd72cff5'}]}, 'timestamp': '2026-02-20 09:59:18.394263', '_unique_id': '069774aba70f45b9adb7e90b1f6dd27e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.396 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.396 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.397 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2dd1e34a-e254-45f2-bfd2-f43853dee2ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.396703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24b310e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'f019db74b348b5cab53be197aa4fdcae964779dbec9812b6f999ce593725448c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.396703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24b439c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'd085530df47e212a585e2dfa8e69acf0f9d1e4a9770a77f42c4c742e1e79e84e'}]}, 'timestamp': '2026-02-20 09:59:18.397668', '_unique_id': 'd6b9787f82f34f8080f172e19343800e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.399 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0222a9e-6b7f-4eca-911f-6dfeb62dfc30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.399751', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24ba760-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '89320e7351cf2056cce931d310b2842f4b2a9a701d92a31b17041f87659edc6e'}]}, 'timestamp': '2026-02-20 09:59:18.400214', '_unique_id': '87f08278c9bc4ed59fa5e7f9526b4902'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.402 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.402 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0e175e9c-70fb-4569-aa41-5144468e6450', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.402282', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24c0a20-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': '1a15be2b4d6ab008da60eb527d178c64eb781a00f1852bcae597172b4c4e124d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.402282', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24c1df8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': '4ef65a0f3ce8131a6e258dc5be1b64c90926a205e266a27f5eb96d858bf2e2dd'}]}, 'timestamp': '2026-02-20 09:59:18.403249', '_unique_id': '84ffe443c8134b79bb2726773af3ca6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:59:18.404 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:59:18.404 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.405 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.406 12 
DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84cf1918-5238-4979-8c15-cbace1c22c69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.405672', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24c8eb4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '0179f39443ea90b02ea08402da55c7b36e38956b6ac5c03db9725918d1862873'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.405672', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24c9e5e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '31ec141441995e7f944cea4a86ee927c86c1cd1b56ffb0b12a01f1b541d782bf'}]}, 'timestamp': '2026-02-20 09:59:18.406507', '_unique_id': 'd4c68cca99eb4fc9a0092f327ad2c4f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:59:18.407 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.408 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.408 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 
04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.409 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcba8144-e7d8-4be9-9c4e-1ff7fbb7e27b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.408697', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24d04ac-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '52d12aa817d16d64481e792c380ebfe0dbb77e6452da9b34c004aa6337c72c07'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.408697', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24d1442-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'dd114991cfeb47ac009d279865e907919033f6172e09b32473bd9eef832c3e30'}]}, 'timestamp': '2026-02-20 09:59:18.409546', '_unique_id': 'bc84ad709e7a489ea90095a6d364cf70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.411 12 DEBUG 
ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd468149f-d960-41f9-a3b1-b6e82e9f1661', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.411583', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24d775c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': 
'c0cd07d1b60d254960152c229660b1ec68bc18d70f2d4a7375b5c420cb374398'}]}, 'timestamp': '2026-02-20 09:59:18.412090', '_unique_id': '278967fa9581497eb967cef853c0f15e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.414 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.414 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3c1de490-9a13-4853-90fd-bf6ff0177128', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.414145', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24dd904-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '7ba451f341cef8c0be3c5d2a4ea25707b61223be8043a85d718bb92e07b1f611'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.414145', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24de976-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'c220d6a5f6b5bd337094faf221bc6b27e54d45603771b9fe422e1784b9c5958b'}]}, 'timestamp': '2026-02-20 09:59:18.414984', '_unique_id': 'b09a0fcd78e544229018be28bc17927f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.417 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.417 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cc2621a8-9323-4038-b6a2-b091b88f994b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.417542', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24e5f1e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '0ca78064d1bdb496b6ab7091bb44ae854bc42b50a2e3b5b9dc10260cf65957df'}]}, 'timestamp': '2026-02-20 09:59:18.418027', '_unique_id': '75dcec9d16184fe1965586c5c1fb5d9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.420 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.420 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dbf80db-866c-4b26-9996-2ce7534fa5a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.420161', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24ec4c2-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '75a51ebb29df916b0cb405ac8b18b1086a242a8ada8a4641ac0e2708a5575cd9'}]}, 'timestamp': '2026-02-20 09:59:18.420768', '_unique_id': '1c2e62ecd7d74b12b533d4e9ade8a2c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.422 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.422 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac274e2f-1a1f-4d75-b3ba-5e851cceb700', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.422705', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24f273c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': 'd6e99cec62f53e216e16615b8f39e8f8b1c1c19f40076498bde44b58337a8878'}]}, 'timestamp': '2026-02-20 09:59:18.423121', '_unique_id': '3cc8e9e7b1ac42948d5f0083b98b0413'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.424 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.425 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81fe8b6f-5d27-4b57-b588-a20c8d1b080d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.424581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24f70f2-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '63e0c186b0a1374cbd6013846ba036e6c16ee85bc213c4870379b9d4b0abaf54'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.424581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24f7f98-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'ca87fa1ac095b3536ce2a2bbb162ca3a9696b27f9bf8424770e1185213d5572a'}]}, 'timestamp': '2026-02-20 09:59:18.425325', '_unique_id': 'ef7e86e4f864459eb29b361c3a1bcbf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.427 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.427 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5a816ce-9419-4a92-92b2-4ca8bf31b2f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:59:18.427124', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd24fd2cc-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.624978983, 'message_signature': '8081a672e73e09c3a3fd5459da9d844513568e5919625dd06d25c4452c8f922a'}]}, 'timestamp': '2026-02-20 09:59:18.427456', '_unique_id': 'caf19ab0db20431cb52bcc5fb7b32040'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1d8e64da-fb24-4ae6-a820-bb5dc51c7704', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.428945', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd25018e0-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '692ef11da7b3e68911f28f2ef830042b29744ded1fbe81cd6040861912a49550'}]}, 'timestamp': '2026-02-20 09:59:18.429315', '_unique_id': '1b487574895e48ccaf0a805df87dc64c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6f9e969-5370-4383-b565-6380f819ef39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.430994', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd2506a98-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '18a1e0f537eb3bce45e435b6f6b15aba1608eb6977baa1344ecba340e75dd4d2'}]}, 'timestamp': '2026-02-20 09:59:18.431374', '_unique_id': 'fd7578575f6547f18dd85410cdb8cdcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging yield Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
09:59:18.431 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 04:59:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 04:59:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 04:59:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging Feb 20 04:59:18 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:18.745 2 INFO neutron.agent.securitygroups_rpc [None req-00ebe7d1-26f1-436c-a8d3-18ae30d4ceca b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['350e41a6-6799-4255-abb2-bda7d280e893']#033[00m Feb 20 04:59:18 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:18.935 2 INFO neutron.agent.securitygroups_rpc [None req-ab9099b6-173a-4528-9f76-ddc0c1b400ee b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['350e41a6-6799-4255-abb2-bda7d280e893']#033[00m Feb 20 04:59:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:59:19 localhost podman[321259]: 2026-02-20 09:59:19.149652615 +0000 UTC m=+0.087831997 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 04:59:19 localhost podman[321259]: 2026-02-20 09:59:19.160734713 +0000 UTC m=+0.098914085 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team) Feb 20 04:59:19 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:59:19 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:59:19 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e207 e207: 6 total, 6 up, 6 in Feb 20 04:59:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:19.464 2 INFO neutron.agent.securitygroups_rpc [None req-483e27bf-9a6d-411a-b87b-b6f37447f4e8 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']#033[00m Feb 20 04:59:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:19.680 2 INFO neutron.agent.securitygroups_rpc [None req-ac337b7c-a535-40e3-b3bd-b5580b0e941d b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']#033[00m Feb 20 04:59:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:19.817 2 INFO neutron.agent.securitygroups_rpc [None req-c5f864ad-b631-4012-870f-280605d80045 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']#033[00m Feb 20 04:59:19 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:19.986 2 INFO neutron.agent.securitygroups_rpc [None req-ac1e61bd-c67f-4839-a794-1523a2080faa b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']#033[00m Feb 20 04:59:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:20.194 2 INFO neutron.agent.securitygroups_rpc [None req-7e4bce2e-fb70-442e-b47b-26c11122b51c 
b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']#033[00m Feb 20 04:59:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:20.356 2 INFO neutron.agent.securitygroups_rpc [None req-9c6d9773-e163-46df-a8b7-894bf61ef867 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']#033[00m Feb 20 04:59:20 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:20 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 20 04:59:20 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3226930422' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 20 04:59:20 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:20.777 2 INFO neutron.agent.securitygroups_rpc [None req-48000ec6-91b4-434a-ab1c-3ae5eaf7b735 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['cefa71e1-4cfe-4451-bb5c-ca133ddcf1fd']#033[00m Feb 20 04:59:21 localhost nova_compute[281288]: 2026-02-20 09:59:21.013 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:21 localhost nova_compute[281288]: 2026-02-20 09:59:21.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:21 localhost nova_compute[281288]: 2026-02-20 09:59:21.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:59:21 localhost nova_compute[281288]: 2026-02-20 09:59:21.016 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:21 localhost nova_compute[281288]: 2026-02-20 09:59:21.084 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:21 localhost nova_compute[281288]: 2026-02-20 09:59:21.085 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:21 localhost ceph-mon[301857]: 
mon.np0005625204@2(peon).osd e208 e208: 6 total, 6 up, 6 in Feb 20 04:59:22 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e209 e209: 6 total, 6 up, 6 in Feb 20 04:59:22 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:22.576 2 INFO neutron.agent.securitygroups_rpc [None req-56dcf14a-a69d-4366-adc0-f7e0579b7cd8 2ba1a8d771344f0a918e0a8bed2efd06 9fdf2c09b98d48c0bc67cc1c7702a8f4 - - default default] Security group rule updated ['9d889f17-f220-427e-bd61-2fb67b868596']#033[00m Feb 20 04:59:22 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:22.694 2 INFO neutron.agent.securitygroups_rpc [None req-87bed7c3-5c32-49ad-acc0-0a3642727263 2ba1a8d771344f0a918e0a8bed2efd06 9fdf2c09b98d48c0bc67cc1c7702a8f4 - - default default] Security group rule updated ['9d889f17-f220-427e-bd61-2fb67b868596']#033[00m Feb 20 04:59:23 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e210 e210: 6 total, 6 up, 6 in Feb 20 04:59:23 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:59:23 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 20 04:59:23 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 20 04:59:24 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e211 e211: 6 total, 6 up, 6 in Feb 20 04:59:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e212 e212: 6 total, 6 up, 6 in Feb 20 04:59:26 localhost nova_compute[281288]: 2026-02-20 09:59:26.086 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:26 localhost nova_compute[281288]: 2026-02-20 09:59:26.089 
281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:26 localhost nova_compute[281288]: 2026-02-20 09:59:26.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:59:26 localhost nova_compute[281288]: 2026-02-20 09:59:26.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:59:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:26 localhost nova_compute[281288]: 2026-02-20 09:59:26.119 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:26 localhost nova_compute[281288]: 2026-02-20 
09:59:26.121 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:26 localhost openstack_network_exporter[244414]: ERROR 09:59:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:59:26 localhost openstack_network_exporter[244414]: Feb 20 04:59:26 localhost openstack_network_exporter[244414]: ERROR 09:59:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:59:26 localhost openstack_network_exporter[244414]: Feb 20 04:59:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:59:27 localhost systemd[1]: tmp-crun.SWTleF.mount: Deactivated successfully. 
Feb 20 04:59:27 localhost podman[321277]: 2026-02-20 09:59:27.148837785 +0000 UTC m=+0.078817382 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 04:59:27 localhost podman[321277]: 2026-02-20 09:59:27.187253736 +0000 UTC m=+0.117233683 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 04:59:27 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:59:27 localhost neutron_sriov_agent[257177]: 2026-02-20 09:59:27.219 2 INFO neutron.agent.securitygroups_rpc [req-f264314a-f5fb-4167-9b9a-7fac156c481a req-f4a185a8-c20a-4c61-b6ac-a21285bd72eb 2ba1a8d771344f0a918e0a8bed2efd06 9fdf2c09b98d48c0bc67cc1c7702a8f4 - - default default] Security group member updated ['9d889f17-f220-427e-bd61-2fb67b868596']#033[00m Feb 20 04:59:28 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e213 e213: 6 total, 6 up, 6 in Feb 20 04:59:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:59:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 20 04:59:30 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 20 04:59:31 localhost nova_compute[281288]: 2026-02-20 09:59:31.122 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:31 localhost nova_compute[281288]: 2026-02-20 09:59:31.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:31 localhost nova_compute[281288]: 2026-02-20 09:59:31.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:59:31 localhost nova_compute[281288]: 2026-02-20 09:59:31.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:31 localhost nova_compute[281288]: 2026-02-20 09:59:31.150 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:31 localhost nova_compute[281288]: 2026-02-20 09:59:31.151 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:33 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e214 e214: 6 total, 6 up, 6 in Feb 20 04:59:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 04:59:33 localhost systemd[1]: tmp-crun.RHrSIn.mount: Deactivated successfully. 
Feb 20 04:59:33 localhost podman[321300]: 2026-02-20 09:59:33.158831018 +0000 UTC m=+0.094599004 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 04:59:33 localhost podman[321300]: 2026-02-20 09:59:33.167464571 +0000 UTC m=+0.103232567 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 04:59:33 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 04:59:33 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 04:59:33 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:33 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. 
Feb 20 04:59:35 localhost podman[321324]: 2026-02-20 09:59:35.139379481 +0000 UTC m=+0.079172843 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 20 04:59:35 localhost podman[321324]: 2026-02-20 09:59:35.151329095 +0000 UTC m=+0.091122467 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 04:59:35 localhost systemd[1]: 
7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 04:59:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e215 e215: 6 total, 6 up, 6 in Feb 20 04:59:36 localhost nova_compute[281288]: 2026-02-20 09:59:36.153 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:36 localhost nova_compute[281288]: 2026-02-20 09:59:36.155 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:36 localhost nova_compute[281288]: 2026-02-20 09:59:36.155 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:59:36 localhost nova_compute[281288]: 2026-02-20 09:59:36.155 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:36 localhost nova_compute[281288]: 2026-02-20 09:59:36.184 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:36 localhost nova_compute[281288]: 2026-02-20 09:59:36.184 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:36 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 04:59:36 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 20 04:59:36 localhost ceph-mon[301857]: from='mgr.44375 
172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 20 04:59:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:36 localhost sshd[321344]: main: sshd: ssh-rsa algorithm is disabled Feb 20 04:59:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e216 e216: 6 total, 6 up, 6 in Feb 20 04:59:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 04:59:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:41 localhost nova_compute[281288]: 2026-02-20 09:59:41.186 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:41 localhost nova_compute[281288]: 2026-02-20 09:59:41.187 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:41 localhost nova_compute[281288]: 2026-02-20 09:59:41.188 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:59:41 localhost nova_compute[281288]: 2026-02-20 09:59:41.188 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:41 localhost nova_compute[281288]: 2026-02-20 09:59:41.222 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:41 localhost nova_compute[281288]: 2026-02-20 09:59:41.223 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 04:59:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 04:59:42 localhost podman[321347]: 2026-02-20 09:59:42.165281257 +0000 UTC m=+0.094365786 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true) Feb 20 04:59:42 localhost 
podman[321347]: 2026-02-20 09:59:42.204047478 +0000 UTC m=+0.133131967 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 04:59:42 localhost systemd[1]: tmp-crun.ThTrLq.mount: Deactivated successfully. 
Feb 20 04:59:42 localhost podman[321346]: 2026-02-20 09:59:42.227217674 +0000 UTC m=+0.158760628 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Feb 20 04:59:42 localhost ovn_controller[156798]: 2026-02-20T09:59:42Z|00398|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 04:59:42 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 04:59:42 localhost nova_compute[281288]: 2026-02-20 09:59:42.294 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:42 localhost podman[321346]: 2026-02-20 09:59:42.338299088 +0000 UTC m=+0.269842022 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 04:59:42 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 04:59:43 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 e217: 6 total, 6 up, 6 in Feb 20 04:59:43 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 04:59:43 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 20 04:59:43 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 20 04:59:46 localhost nova_compute[281288]: 2026-02-20 09:59:46.224 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:46 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Feb 20 04:59:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:59:47 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 14K writes, 56K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s#012Cumulative WAL: 14K writes, 4683 syncs, 3.11 writes per sync, written: 0.04 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 9472 writes, 33K keys, 9472 commit groups, 1.0 writes per commit group, ingest: 25.58 MB, 0.04 MB/s#012Interval WAL: 9472 writes, 3987 syncs, 2.38 writes per sync, written: 
0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:59:47 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/52918c2e-6ed5-45c2-9872-88b3bd77010f/6bdf5028-e9f4-4a0d-812d-96892d0e92d0", "osd", "allow rw pool=manila_data namespace=fsvolumens_52918c2e-6ed5-45c2-9872-88b3bd77010f", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:47 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/52918c2e-6ed5-45c2-9872-88b3bd77010f/6bdf5028-e9f4-4a0d-812d-96892d0e92d0", "osd", "allow rw pool=manila_data namespace=fsvolumens_52918c2e-6ed5-45c2-9872-88b3bd77010f", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:47 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:59:47 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:47 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:47 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 04:59:47 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:59:47 localhost podman[241968]: time="2026-02-20T09:59:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 04:59:47 localhost podman[241968]: @ - - [20/Feb/2026:09:59:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 04:59:47 localhost podman[241968]: @ - - [20/Feb/2026:09:59:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18356 "" "Go-http-client/1.1" Feb 20 04:59:49 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 04:59:50 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:59:50 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 20 04:59:50 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 20 04:59:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 04:59:50 localhost podman[321476]: 2026-02-20 09:59:50.161798917 +0000 UTC m=+0.085580189 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
config_id=ceilometer_agent_compute) Feb 20 04:59:50 localhost podman[321476]: 2026-02-20 09:59:50.202087054 +0000 UTC m=+0.125868306 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 04:59:50 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 04:59:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 04:59:51 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/813520502' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 04:59:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 04:59:51 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/813520502' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 04:59:51 localhost nova_compute[281288]: 2026-02-20 09:59:51.226 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:51 localhost nova_compute[281288]: 2026-02-20 09:59:51.231 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 20 04:59:51 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 20K writes, 74K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s#012Cumulative WAL: 20K writes, 6880 syncs, 2.95 writes per sync, written: 0.06 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 14K writes, 49K keys, 14K commit groups, 1.0 writes per commit group, ingest: 38.58 MB, 0.06 MB/s#012Interval WAL: 14K writes, 
5950 syncs, 2.40 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 20 04:59:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:59:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Feb 20 04:59:56 localhost nova_compute[281288]: 2026-02-20 09:59:56.260 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:56 localhost nova_compute[281288]: 2026-02-20 09:59:56.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 04:59:56 localhost nova_compute[281288]: 2026-02-20 09:59:56.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 04:59:56 localhost nova_compute[281288]: 2026-02-20 09:59:56.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:56 localhost nova_compute[281288]: 2026-02-20 09:59:56.263 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:56 localhost nova_compute[281288]: 2026-02-20 09:59:56.263 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 04:59:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 04:59:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 20 04:59:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 20 04:59:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-622295165", "format": "json"} : dispatch Feb 20 04:59:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-622295165", "caps": ["mds", "allow 
rw path=/volumes/_nogroup/2e5c48d9-dcb2-469b-91b0-c7f808d95c49/17cc3f20-7a80-42d5-908d-fb1e4bba5d82", "osd", "allow rw pool=manila_data namespace=fsvolumens_2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:56 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-622295165", "caps": ["mds", "allow rw path=/volumes/_nogroup/2e5c48d9-dcb2-469b-91b0-c7f808d95c49/17cc3f20-7a80-42d5-908d-fb1e4bba5d82", "osd", "allow rw pool=manila_data namespace=fsvolumens_2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 04:59:56 localhost openstack_network_exporter[244414]: ERROR 09:59:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 04:59:56 localhost openstack_network_exporter[244414]: Feb 20 04:59:56 localhost openstack_network_exporter[244414]: ERROR 09:59:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 04:59:56 localhost openstack_network_exporter[244414]: Feb 20 04:59:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 04:59:58 localhost systemd[1]: tmp-crun.neYjNm.mount: Deactivated successfully. 
Feb 20 04:59:58 localhost podman[321496]: 2026-02-20 09:59:58.167917655 +0000 UTC m=+0.103347150 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 04:59:58 localhost podman[321496]: 2026-02-20 09:59:58.204041505 +0000 UTC m=+0.139471000 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 04:59:58 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 04:59:58 localhost nova_compute[281288]: 2026-02-20 09:59:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:58 localhost nova_compute[281288]: 2026-02-20 09:59:58.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 04:59:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:58.940 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:59:58 localhost nova_compute[281288]: 2026-02-20 09:59:58.941 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:58 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:58.942 162652 DEBUG 
neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 04:59:59 localhost neutron_dhcp_agent[264351]: 2026-02-20 09:59:59.274 264355 INFO neutron.agent.linux.ip_lib [None req-f95d6219-14d7-4ca9-bfb9-013969994773 - - - - - -] Device tapda8c9dd1-5a cannot be used as it has no MAC address#033[00m Feb 20 04:59:59 localhost nova_compute[281288]: 2026-02-20 09:59:59.305 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:59 localhost kernel: device tapda8c9dd1-5a entered promiscuous mode Feb 20 04:59:59 localhost NetworkManager[5988]: [1771581599.3182] manager: (tapda8c9dd1-5a): new Generic device (/org/freedesktop/NetworkManager/Devices/62) Feb 20 04:59:59 localhost ovn_controller[156798]: 2026-02-20T09:59:59Z|00399|binding|INFO|Claiming lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 for this chassis. Feb 20 04:59:59 localhost ovn_controller[156798]: 2026-02-20T09:59:59Z|00400|binding|INFO|da8c9dd1-5a21-4397-88e4-37d2dfab4a31: Claiming unknown Feb 20 04:59:59 localhost systemd-udevd[321530]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 04:59:59 localhost nova_compute[281288]: 2026-02-20 09:59:59.322 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:59.330 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-3615f6b8-3945-4c93-ab04-14a8ee32065e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3615f6b8-3945-4c93-ab04-14a8ee32065e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55965a8332c94f2da5d707adc081ab9c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6b15549-1f91-4b7d-aa8a-1aa5d439e964, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=da8c9dd1-5a21-4397-88e4-37d2dfab4a31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 04:59:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:59.333 162652 INFO neutron.agent.ovn.metadata.agent [-] Port da8c9dd1-5a21-4397-88e4-37d2dfab4a31 in datapath 3615f6b8-3945-4c93-ab04-14a8ee32065e bound to our chassis#033[00m Feb 20 04:59:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:59.336 162652 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3615f6b8-3945-4c93-ab04-14a8ee32065e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 20 04:59:59 localhost ovn_metadata_agent[162647]: 2026-02-20 09:59:59.337 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d50e37c5-caa2-413c-a2df-161412833f45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 04:59:59 localhost journal[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device Feb 20 04:59:59 localhost journal[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device Feb 20 04:59:59 localhost ovn_controller[156798]: 2026-02-20T09:59:59Z|00401|binding|INFO|Setting lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 ovn-installed in OVS Feb 20 04:59:59 localhost ovn_controller[156798]: 2026-02-20T09:59:59Z|00402|binding|INFO|Setting lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 up in Southbound Feb 20 04:59:59 localhost nova_compute[281288]: 2026-02-20 09:59:59.360 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:59 localhost nova_compute[281288]: 2026-02-20 09:59:59.361 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:59 localhost journal[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device Feb 20 04:59:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 04:59:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", 
"allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 04:59:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 04:59:59 localhost journal[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device Feb 20 04:59:59 localhost journal[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device Feb 20 04:59:59 localhost journal[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device Feb 20 04:59:59 localhost journal[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device Feb 20 04:59:59 localhost nova_compute[281288]: 2026-02-20 09:59:59.398 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 04:59:59 localhost journal[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device Feb 20 04:59:59 localhost nova_compute[281288]: 2026-02-20 09:59:59.433 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:00 localhost podman[321600]: Feb 20 05:00:00 localhost podman[321600]: 2026-02-20 10:00:00.377372991 +0000 UTC m=+0.095375987 container create ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, 
tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 20 05:00:00 localhost ceph-mon[301857]: overall HEALTH_OK Feb 20 05:00:00 localhost podman[321600]: 2026-02-20 10:00:00.329356108 +0000 UTC m=+0.047359134 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 05:00:00 localhost systemd[1]: Started libpod-conmon-ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a.scope. Feb 20 05:00:00 localhost systemd[1]: Started libcrun container. Feb 20 05:00:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141d473fcfd8681de9cd6501789f09d5aafaf3822466c87fc964c4d88a2476a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 05:00:00 localhost podman[321600]: 2026-02-20 10:00:00.459864045 +0000 UTC m=+0.177867041 container init ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 05:00:00 localhost podman[321600]: 2026-02-20 10:00:00.470454058 +0000 UTC m=+0.188457054 container start ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 20 05:00:00 localhost dnsmasq[321618]: started, version 2.85 cachesize 150 Feb 20 05:00:00 localhost dnsmasq[321618]: DNS service limited to local subnets Feb 20 05:00:00 localhost dnsmasq[321618]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 05:00:00 localhost dnsmasq[321618]: warning: no upstream servers configured Feb 20 05:00:00 localhost dnsmasq-dhcp[321618]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 05:00:00 localhost dnsmasq[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/addn_hosts - 0 addresses Feb 20 05:00:00 localhost dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/host Feb 20 05:00:00 localhost dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/opts Feb 20 05:00:00 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:00:00.583 264355 INFO neutron.agent.dhcp.agent [None req-dd6ae2f9-123b-49a7-803c-a7ded59e57e7 - - - - - -] DHCP configuration for ports {'74fd2a4a-9b9e-4edb-b114-d1121b443c64'} is completed#033[00m Feb 20 05:00:01 localhost nova_compute[281288]: 2026-02-20 10:00:01.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:01 localhost 
nova_compute[281288]: 2026-02-20 10:00:01.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:01 localhost nova_compute[281288]: 2026-02-20 10:00:01.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:01 localhost nova_compute[281288]: 2026-02-20 10:00:01.740 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:00:01 localhost nova_compute[281288]: 2026-02-20 10:00:01.740 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:00:01 localhost nova_compute[281288]: 2026-02-20 10:00:01.741 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:00:01 localhost nova_compute[281288]: 2026-02-20 10:00:01.741 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for 
np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 05:00:01 localhost nova_compute[281288]: 2026-02-20 10:00:01.742 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:00:01 localhost nova_compute[281288]: 2026-02-20 10:00:01.760 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 05:00:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/738363458' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 05:00:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 05:00:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/738363458' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 05:00:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:00:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1177894248' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.185 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.253 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.254 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:00:02 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 05:00:02 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 20 05:00:02 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.492 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.494 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11250MB free_disk=41.70030212402344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.494 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.495 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.573 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.574 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.574 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 05:00:02 localhost nova_compute[281288]: 2026-02-20 10:00:02.616 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:00:02 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:00:02.682 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:00:02Z, description=, device_id=06de8864-90cd-41d9-8a7d-9e83a5e36d4c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f42f7f41-995c-4a84-bc94-975a54360372, ip_allocation=immediate, mac_address=fa:16:3e:9e:1f:34, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:59:57Z, description=, dns_domain=, 
id=3615f6b8-3945-4c93-ab04-14a8ee32065e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1823604132-network, port_security_enabled=True, project_id=55965a8332c94f2da5d707adc081ab9c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36031, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3267, status=ACTIVE, subnets=['336a109d-999b-46ec-9331-c83fb6320087'], tags=[], tenant_id=55965a8332c94f2da5d707adc081ab9c, updated_at=2026-02-20T09:59:57Z, vlan_transparent=None, network_id=3615f6b8-3945-4c93-ab04-14a8ee32065e, port_security_enabled=False, project_id=55965a8332c94f2da5d707adc081ab9c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3279, status=DOWN, tags=[], tenant_id=55965a8332c94f2da5d707adc081ab9c, updated_at=2026-02-20T10:00:02Z on network 3615f6b8-3945-4c93-ab04-14a8ee32065e#033[00m Feb 20 05:00:02 localhost dnsmasq[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/addn_hosts - 1 addresses Feb 20 05:00:02 localhost dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/host Feb 20 05:00:02 localhost dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/opts Feb 20 05:00:02 localhost podman[321678]: 2026-02-20 10:00:02.914487202 +0000 UTC m=+0.060265418 container kill ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 05:00:02 localhost ovn_metadata_agent[162647]: 2026-02-20 10:00:02.945 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 05:00:03 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:00:03 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1389774980' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:00:03 localhost nova_compute[281288]: 2026-02-20 10:00:03.048 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:00:03 localhost nova_compute[281288]: 2026-02-20 10:00:03.055 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 05:00:03 localhost nova_compute[281288]: 2026-02-20 10:00:03.072 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 
'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 05:00:03 localhost nova_compute[281288]: 2026-02-20 10:00:03.075 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 05:00:03 localhost nova_compute[281288]: 2026-02-20 10:00:03.075 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:00:03 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:00:03.139 264355 INFO neutron.agent.dhcp.agent [None req-77ba4fff-74cb-401d-a84c-fa0d92a6d99c - - - - - -] DHCP configuration for ports {'f42f7f41-995c-4a84-bc94-975a54360372'} is completed#033[00m Feb 20 05:00:03 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-622295165", "format": "json"} : dispatch Feb 20 05:00:03 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-622295165"} : dispatch Feb 20 05:00:03 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-622295165"}]': finished Feb 20 05:00:04 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 05:00:04 localhost nova_compute[281288]: 2026-02-20 10:00:04.078 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:04 localhost nova_compute[281288]: 2026-02-20 10:00:04.080 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:04 localhost podman[321700]: 2026-02-20 10:00:04.149885292 +0000 UTC m=+0.085610260 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 05:00:04 localhost podman[321700]: 2026-02-20 10:00:04.165141956 +0000 UTC m=+0.100866904 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 05:00:04 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: 
Deactivated successfully. Feb 20 05:00:05 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:00:05.022 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:00:02Z, description=, device_id=06de8864-90cd-41d9-8a7d-9e83a5e36d4c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f42f7f41-995c-4a84-bc94-975a54360372, ip_allocation=immediate, mac_address=fa:16:3e:9e:1f:34, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:59:57Z, description=, dns_domain=, id=3615f6b8-3945-4c93-ab04-14a8ee32065e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1823604132-network, port_security_enabled=True, project_id=55965a8332c94f2da5d707adc081ab9c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36031, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3267, status=ACTIVE, subnets=['336a109d-999b-46ec-9331-c83fb6320087'], tags=[], tenant_id=55965a8332c94f2da5d707adc081ab9c, updated_at=2026-02-20T09:59:57Z, vlan_transparent=None, network_id=3615f6b8-3945-4c93-ab04-14a8ee32065e, port_security_enabled=False, project_id=55965a8332c94f2da5d707adc081ab9c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3279, status=DOWN, tags=[], tenant_id=55965a8332c94f2da5d707adc081ab9c, updated_at=2026-02-20T10:00:02Z on network 3615f6b8-3945-4c93-ab04-14a8ee32065e#033[00m Feb 20 05:00:05 localhost podman[321739]: 2026-02-20 10:00:05.241126679 +0000 UTC m=+0.060340859 container kill ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2) Feb 20 05:00:05 localhost dnsmasq[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/addn_hosts - 1 addresses Feb 20 05:00:05 localhost dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/host Feb 20 05:00:05 localhost dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/opts Feb 20 05:00:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:00:05 localhost podman[321752]: 2026-02-20 10:00:05.345478079 +0000 UTC m=+0.078075790 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., 
com.redhat.component=ubi9-minimal-container, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git) Feb 20 05:00:05 localhost podman[321752]: 2026-02-20 10:00:05.389090777 +0000 UTC m=+0.121688478 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vcs-type=git, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 20 05:00:05 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 05:00:05 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:00:05.569 264355 INFO neutron.agent.dhcp.agent [None req-b4114ec0-08bd-4e83-86c0-406fd7915555 - - - - - -] DHCP configuration for ports {'f42f7f41-995c-4a84-bc94-975a54360372'} is completed#033[00m Feb 20 05:00:05 localhost nova_compute[281288]: 2026-02-20 10:00:05.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:05 localhost nova_compute[281288]: 2026-02-20 10:00:05.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 05:00:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:00:06.022 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:00:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:00:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:00:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:00:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:00:06 localhost nova_compute[281288]: 2026-02-20 10:00:06.328 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:06 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 05:00:06 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:06 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:06 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Feb 20 05:00:06 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Feb 20 05:00:06 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Feb 20 
05:00:06 localhost nova_compute[281288]: 2026-02-20 10:00:06.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:06 localhost nova_compute[281288]: 2026-02-20 10:00:06.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:07 localhost nova_compute[281288]: 2026-02-20 10:00:07.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:07 localhost nova_compute[281288]: 2026-02-20 10:00:07.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 05:00:07 localhost nova_compute[281288]: 2026-02-20 10:00:07.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 05:00:07 localhost nova_compute[281288]: 2026-02-20 10:00:07.820 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 05:00:07 localhost nova_compute[281288]: 2026-02-20 10:00:07.820 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 05:00:07 localhost nova_compute[281288]: 2026-02-20 10:00:07.820 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 05:00:07 localhost nova_compute[281288]: 2026-02-20 10:00:07.821 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 05:00:08 localhost sshd[321777]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:00:08 localhost nova_compute[281288]: 2026-02-20 10:00:08.585 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 05:00:08 localhost nova_compute[281288]: 2026-02-20 10:00:08.607 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 05:00:08 localhost nova_compute[281288]: 2026-02-20 10:00:08.607 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 05:00:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e218 e218: 6 total, 6 up, 6 in Feb 20 05:00:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e219 e219: 6 total, 6 up, 6 in Feb 20 05:00:09 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 05:00:09 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 20 05:00:09 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 20 05:00:09 localhost ceph-mon[301857]: from='mgr.44375 
172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch Feb 20 05:00:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e220 e220: 6 total, 6 up, 6 in Feb 20 05:00:11 localhost nova_compute[281288]: 2026-02-20 10:00:11.332 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:11 localhost nova_compute[281288]: 2026-02-20 10:00:11.334 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:11 localhost nova_compute[281288]: 2026-02-20 10:00:11.335 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:00:11 localhost nova_compute[281288]: 2026-02-20 10:00:11.335 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:11 localhost nova_compute[281288]: 2026-02-20 10:00:11.391 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:11 localhost nova_compute[281288]: 2026-02-20 10:00:11.392 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. 
Feb 20 05:00:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 05:00:13 localhost systemd[1]: tmp-crun.NSrdvE.mount: Deactivated successfully. Feb 20 05:00:13 localhost podman[321780]: 2026-02-20 10:00:13.164816868 +0000 UTC m=+0.094315485 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:00:13 localhost podman[321780]: 2026-02-20 10:00:13.197284498 +0000 UTC m=+0.126783145 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:00:13 localhost systemd[1]: tmp-crun.9yztSu.mount: Deactivated successfully. Feb 20 05:00:13 localhost podman[321779]: 2026-02-20 10:00:13.206049195 +0000 UTC m=+0.138831792 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 20 05:00:13 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 05:00:13 localhost podman[321779]: 2026-02-20 10:00:13.266281629 +0000 UTC m=+0.199064266 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:00:13 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 05:00:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e221 e221: 6 total, 6 up, 6 in Feb 20 05:00:14 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Feb 20 05:00:14 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/7f87993f-62fd-4706-b657-9586f12f2a62/5d470986-550e-47dc-98aa-07cb4fcc93dc", "osd", "allow rw pool=manila_data namespace=fsvolumens_7f87993f-62fd-4706-b657-9586f12f2a62", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:14 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/7f87993f-62fd-4706-b657-9586f12f2a62/5d470986-550e-47dc-98aa-07cb4fcc93dc", "osd", "allow rw pool=manila_data namespace=fsvolumens_7f87993f-62fd-4706-b657-9586f12f2a62", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:14 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 05:00:14 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:14 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e222 e222: 6 total, 6 up, 6 in Feb 20 05:00:16 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 05:00:16 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 20 05:00:16 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 20 05:00:16 localhost nova_compute[281288]: 2026-02-20 10:00:16.393 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:16 localhost nova_compute[281288]: 2026-02-20 10:00:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:16 localhost nova_compute[281288]: 2026-02-20 10:00:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:00:16 localhost nova_compute[281288]: 2026-02-20 10:00:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:16 localhost nova_compute[281288]: 2026-02-20 10:00:16.422 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:16 localhost nova_compute[281288]: 2026-02-20 10:00:16.423 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e223 e223: 6 total, 6 up, 6 in Feb 20 05:00:17 localhost podman[241968]: time="2026-02-20T10:00:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:00:17 localhost podman[241968]: @ - - [20/Feb/2026:10:00:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 20 05:00:17 localhost podman[241968]: @ - - [20/Feb/2026:10:00:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18836 "" "Go-http-client/1.1" Feb 20 05:00:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e224 e224: 6 total, 6 up, 6 in Feb 20 05:00:18 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 05:00:19 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:19 
localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:19 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Feb 20 05:00:20 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e225 e225: 6 total, 6 up, 6 in Feb 20 05:00:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 05:00:21 localhost podman[321822]: 2026-02-20 10:00:21.154988613 +0000 UTC m=+0.086102445 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute) Feb 20 05:00:21 localhost sshd[321837]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:00:21 localhost podman[321822]: 2026-02-20 10:00:21.191206087 +0000 UTC m=+0.122319919 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 
'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Feb 20 05:00:21 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. 
Feb 20 05:00:21 localhost nova_compute[281288]: 2026-02-20 10:00:21.424 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:21 localhost nova_compute[281288]: 2026-02-20 10:00:21.426 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:21 localhost nova_compute[281288]: 2026-02-20 10:00:21.426 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:00:21 localhost nova_compute[281288]: 2026-02-20 10:00:21.426 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:21 localhost nova_compute[281288]: 2026-02-20 10:00:21.441 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:21 localhost nova_compute[281288]: 2026-02-20 10:00:21.442 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:22 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e226 e226: 6 total, 6 up, 6 in Feb 20 05:00:23 localhost ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. 
Feb 20 05:00:23 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 05:00:23 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 20 05:00:23 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 20 05:00:23 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e227 e227: 6 total, 6 up, 6 in Feb 20 05:00:25 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 05:00:25 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e228 e228: 6 total, 6 up, 6 in Feb 20 05:00:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' 
entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Feb 20 05:00:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Feb 20 05:00:26 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Feb 20 05:00:26 localhost nova_compute[281288]: 2026-02-20 10:00:26.443 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:26 localhost nova_compute[281288]: 2026-02-20 10:00:26.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:26 localhost nova_compute[281288]: 2026-02-20 10:00:26.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:00:26 localhost nova_compute[281288]: 2026-02-20 10:00:26.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:26 localhost nova_compute[281288]: 2026-02-20 10:00:26.445 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:26 localhost nova_compute[281288]: 2026-02-20 10:00:26.448 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 
318767104 Feb 20 05:00:26 localhost openstack_network_exporter[244414]: ERROR 10:00:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:00:26 localhost openstack_network_exporter[244414]: Feb 20 05:00:26 localhost openstack_network_exporter[244414]: ERROR 10:00:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:00:26 localhost openstack_network_exporter[244414]: Feb 20 05:00:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e229 e229: 6 total, 6 up, 6 in Feb 20 05:00:28 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e230 e230: 6 total, 6 up, 6 in Feb 20 05:00:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 05:00:29 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 05:00:29 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 20 05:00:29 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 20 05:00:29 localhost podman[321844]: 2026-02-20 10:00:29.185629602 +0000 UTC m=+0.120094430 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 05:00:29 localhost podman[321844]: 2026-02-20 10:00:29.21905801 +0000 UTC m=+0.153522828 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 05:00:29 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 05:00:30 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e231 e231: 6 total, 6 up, 6 in Feb 20 05:00:31 localhost nova_compute[281288]: 2026-02-20 10:00:31.450 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:31 localhost nova_compute[281288]: 2026-02-20 10:00:31.450 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:31 localhost nova_compute[281288]: 2026-02-20 10:00:31.451 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:00:31 localhost nova_compute[281288]: 2026-02-20 10:00:31.451 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:32 localhost neutron_sriov_agent[257177]: 2026-02-20 10:00:32.128 2 INFO neutron.agent.securitygroups_rpc [req-dff2b32a-81fc-4277-8af6-ada27919a489 req-cd96c648-ee5c-4ce2-9c39-be0fe03b42e9 2ba1a8d771344f0a918e0a8bed2efd06 9fdf2c09b98d48c0bc67cc1c7702a8f4 - - default default] Security group member updated ['9d889f17-f220-427e-bd61-2fb67b868596']#033[00m Feb 20 05:00:32 localhost nova_compute[281288]: 2026-02-20 10:00:32.161 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:32 localhost nova_compute[281288]: 2026-02-20 10:00:32.162 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 
05:00:32 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 05:00:32 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:33 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e232 e232: 6 total, 6 up, 6 in Feb 20 05:00:33 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:34 localhost sshd[321865]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:00:34 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e233 e233: 6 total, 6 up, 6 in Feb 20 05:00:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 05:00:35 localhost podman[321867]: 2026-02-20 10:00:35.129052687 +0000 UTC m=+0.067729845 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 05:00:35 localhost podman[321867]: 2026-02-20 10:00:35.138876097 +0000 UTC m=+0.077553275 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 05:00:35 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 05:00:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:00:35 localhost systemd[1]: tmp-crun.PmSZuu.mount: Deactivated successfully. 
Feb 20 05:00:35 localhost podman[321890]: 2026-02-20 10:00:35.5272588 +0000 UTC m=+0.085628760 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git) Feb 20 05:00:35 localhost podman[321890]: 2026-02-20 10:00:35.539228634 +0000 UTC m=+0.097598544 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, release=1770267347, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7) Feb 20 05:00:35 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 05:00:35 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 20 05:00:35 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 20 05:00:35 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 20 05:00:35 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e234 e234: 6 total, 6 up, 6 in Feb 20 05:00:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e235 e235: 6 total, 6 up, 6 in Feb 20 05:00:37 localhost nova_compute[281288]: 2026-02-20 10:00:37.165 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:37 localhost nova_compute[281288]: 2026-02-20 10:00:37.167 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:37 localhost nova_compute[281288]: 2026-02-20 10:00:37.167 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:00:37 localhost nova_compute[281288]: 2026-02-20 10:00:37.168 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:37 localhost nova_compute[281288]: 2026-02-20 10:00:37.189 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:37 localhost nova_compute[281288]: 2026-02-20 10:00:37.190 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 05:00:37 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/711179978' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 05:00:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 05:00:37 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/711179978' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 05:00:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 05:00:37 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/660812735' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 05:00:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 05:00:37 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/660812735' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 05:00:38 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e236 e236: 6 total, 6 up, 6 in Feb 20 05:00:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 05:00:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:39 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e237 e237: 6 total, 6 up, 6 in Feb 20 05:00:40 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 05:00:40 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2569557107' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 05:00:40 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 05:00:40 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2569557107' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 05:00:41 localhost sshd[321912]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:00:41 localhost ovn_controller[156798]: 2026-02-20T10:00:41Z|00403|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:00:41 localhost nova_compute[281288]: 2026-02-20 10:00:41.442 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:42 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 05:00:42 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 20 05:00:42 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 20 05:00:42 localhost nova_compute[281288]: 2026-02-20 10:00:42.190 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:42 localhost nova_compute[281288]: 
2026-02-20 10:00:42.192 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:43 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e238 e238: 6 total, 6 up, 6 in Feb 20 05:00:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:00:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 05:00:44 localhost systemd[1]: tmp-crun.SO1pgj.mount: Deactivated successfully. Feb 20 05:00:44 localhost podman[321915]: 2026-02-20 10:00:44.172419011 +0000 UTC m=+0.104789585 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true) Feb 20 05:00:44 localhost podman[321915]: 2026-02-20 10:00:44.182035973 +0000 UTC m=+0.114406547 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent) Feb 20 05:00:44 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 05:00:44 localhost podman[321914]: 2026-02-20 10:00:44.270748136 +0000 UTC m=+0.205191463 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Feb 20 05:00:44 localhost podman[321914]: 2026-02-20 10:00:44.341711488 +0000 UTC m=+0.276154855 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 20 05:00:44 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: 
Deactivated successfully. Feb 20 05:00:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 05:00:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:45 localhost nova_compute[281288]: 2026-02-20 10:00:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:00:45 localhost nova_compute[281288]: 2026-02-20 10:00:45.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 20 05:00:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:47 
localhost nova_compute[281288]: 2026-02-20 10:00:47.193 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:47 localhost nova_compute[281288]: 2026-02-20 10:00:47.195 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:00:47 localhost nova_compute[281288]: 2026-02-20 10:00:47.195 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:00:47 localhost nova_compute[281288]: 2026-02-20 10:00:47.195 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:47 localhost nova_compute[281288]: 2026-02-20 10:00:47.196 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:47 localhost nova_compute[281288]: 2026-02-20 10:00:47.196 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:00:47 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e239 e239: 6 total, 6 up, 6 in Feb 20 05:00:47 localhost podman[241968]: time="2026-02-20T10:00:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:00:47 localhost podman[241968]: @ - - [20/Feb/2026:10:00:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 20 05:00:47 localhost podman[241968]: @ - - [20/Feb/2026:10:00:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18826 "" "Go-http-client/1.1" Feb 20 
05:00:48 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 05:00:48 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3370685851' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 05:00:48 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 05:00:48 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3370685851' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 05:00:48 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 20 05:00:48 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 20 05:00:48 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 20 05:00:49 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 05:00:49 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:00:50 localhost dnsmasq[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/addn_hosts - 0 addresses Feb 20 05:00:50 localhost dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/host Feb 20 05:00:50 localhost dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/opts Feb 20 05:00:50 localhost 
podman[322061]: 2026-02-20 10:00:50.650955859 +0000 UTC m=+0.071827439 container kill ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:00:50 localhost systemd[1]: tmp-crun.InfOxR.mount: Deactivated successfully. Feb 20 05:00:50 localhost kernel: device tapda8c9dd1-5a left promiscuous mode Feb 20 05:00:50 localhost ovn_controller[156798]: 2026-02-20T10:00:50Z|00404|binding|INFO|Releasing lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 from this chassis (sb_readonly=0) Feb 20 05:00:50 localhost ovn_controller[156798]: 2026-02-20T10:00:50Z|00405|binding|INFO|Setting lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 down in Southbound Feb 20 05:00:50 localhost nova_compute[281288]: 2026-02-20 10:00:50.890 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:50 localhost ovn_metadata_agent[162647]: 2026-02-20 10:00:50.898 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-3615f6b8-3945-4c93-ab04-14a8ee32065e', 
'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3615f6b8-3945-4c93-ab04-14a8ee32065e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55965a8332c94f2da5d707adc081ab9c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6b15549-1f91-4b7d-aa8a-1aa5d439e964, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=da8c9dd1-5a21-4397-88e4-37d2dfab4a31) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:00:50 localhost ovn_metadata_agent[162647]: 2026-02-20 10:00:50.902 162652 INFO neutron.agent.ovn.metadata.agent [-] Port da8c9dd1-5a21-4397-88e4-37d2dfab4a31 in datapath 3615f6b8-3945-4c93-ab04-14a8ee32065e unbound from our chassis#033[00m Feb 20 05:00:50 localhost ovn_metadata_agent[162647]: 2026-02-20 10:00:50.905 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3615f6b8-3945-4c93-ab04-14a8ee32065e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 05:00:50 localhost ovn_metadata_agent[162647]: 2026-02-20 10:00:50.907 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[3b068111-fbec-4c9f-a72b-0de63a4775a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 05:00:50 localhost nova_compute[281288]: 2026-02-20 10:00:50.916 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e239 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 05:00:52 localhost podman[322084]: 2026-02-20 10:00:52.165925857 +0000 UTC m=+0.100315388 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3) Feb 20 05:00:52 localhost podman[322084]: 2026-02-20 10:00:52.179951194 +0000 UTC m=+0.114340755 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 20 05:00:52 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:00:52 localhost nova_compute[281288]: 2026-02-20 10:00:52.240 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:52 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 05:00:52 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:52 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:52 localhost ovn_controller[156798]: 2026-02-20T10:00:52Z|00406|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:00:52 localhost nova_compute[281288]: 2026-02-20 10:00:52.606 
281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e240 e240: 6 total, 6 up, 6 in Feb 20 05:00:53 localhost dnsmasq[321618]: exiting on receipt of SIGTERM Feb 20 05:00:53 localhost systemd[1]: tmp-crun.dgjJm7.mount: Deactivated successfully. Feb 20 05:00:53 localhost podman[322121]: 2026-02-20 10:00:53.379802431 +0000 UTC m=+0.078256415 container kill ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 20 05:00:53 localhost systemd[1]: libpod-ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a.scope: Deactivated successfully. Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0. 
Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.412796) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40 Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653412848, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2908, "num_deletes": 273, "total_data_size": 4087565, "memory_usage": 4222976, "flush_reason": "Manual Compaction"} Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653424790, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 2672399, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25896, "largest_seqno": 28799, "table_properties": {"data_size": 2660779, "index_size": 7293, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 29325, "raw_average_key_size": 22, "raw_value_size": 2635779, "raw_average_value_size": 2044, "num_data_blocks": 311, "num_entries": 1289, "num_filter_entries": 1289, "num_deletions": 273, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581543, "oldest_key_time": 1771581543, "file_creation_time": 1771581653, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}} Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 12047 microseconds, and 6083 cpu microseconds. Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.424845) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 2672399 bytes OK Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.424871) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.426746) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.426775) EVENT_LOG_v1 {"time_micros": 1771581653426768, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.426801) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 4073584, prev total WAL file 
size 4073584, number of live WAL files 2. Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.427582) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end) Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(2609KB)], [39(18MB)] Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653427646, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 21603699, "oldest_snapshot_seqno": -1} Feb 20 05:00:53 localhost podman[322134]: 2026-02-20 10:00:53.467586116 +0000 UTC m=+0.069421846 container died ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] 
[JOB 22] Generated table #42: 13874 keys, 19893963 bytes, temperature: kUnknown Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653519810, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 19893963, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19810293, "index_size": 47921, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34693, "raw_key_size": 369904, "raw_average_key_size": 26, "raw_value_size": 19570192, "raw_average_value_size": 1410, "num_data_blocks": 1820, "num_entries": 13874, "num_filter_entries": 13874, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581653, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}} Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.520437) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 19893963 bytes Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.522835) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 233.9 rd, 215.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 18.1 +0.0 blob) out(19.0 +0.0 blob), read-write-amplify(15.5) write-amplify(7.4) OK, records in: 14430, records dropped: 556 output_compression: NoCompression Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.522877) EVENT_LOG_v1 {"time_micros": 1771581653522860, "job": 22, "event": "compaction_finished", "compaction_time_micros": 92371, "compaction_time_cpu_micros": 52560, "output_level": 6, "num_output_files": 1, "total_output_size": 19893963, "num_input_records": 14430, "num_output_records": 13874, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653523893, "job": 22, "event": "table_file_deletion", "file_number": 41} Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653528030, 
"job": 22, "event": "table_file_deletion", "file_number": 39} Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.427477) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528183) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528194) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:00:53 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:00:53 localhost podman[322134]: 2026-02-20 10:00:53.591167402 +0000 UTC m=+0.193003092 container cleanup ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:00:53 localhost systemd[1]: libpod-conmon-ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a.scope: Deactivated successfully. 
Feb 20 05:00:53 localhost podman[322136]: 2026-02-20 10:00:53.614371479 +0000 UTC m=+0.203972026 container remove ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 05:00:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:00:53.640 264355 INFO neutron.agent.dhcp.agent [None req-56cd3450-5a3d-4da5-ac04-6a75b95f3e88 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:00:53 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:00:53.641 264355 INFO neutron.agent.dhcp.agent [None req-56cd3450-5a3d-4da5-ac04-6a75b95f3e88 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:00:54 localhost systemd[1]: tmp-crun.AAus7f.mount: Deactivated successfully. Feb 20 05:00:54 localhost systemd[1]: var-lib-containers-storage-overlay-141d473fcfd8681de9cd6501789f09d5aafaf3822466c87fc964c4d88a2476a2-merged.mount: Deactivated successfully. Feb 20 05:00:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a-userdata-shm.mount: Deactivated successfully. Feb 20 05:00:54 localhost systemd[1]: run-netns-qdhcp\x2d3615f6b8\x2d3945\x2d4c93\x2dab04\x2d14a8ee32065e.mount: Deactivated successfully. 
Feb 20 05:00:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:00:55 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 05:00:55 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 20 05:00:55 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 20 05:00:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:00:56 localhost openstack_network_exporter[244414]: ERROR 10:00:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:00:56 localhost openstack_network_exporter[244414]: Feb 20 05:00:56 localhost openstack_network_exporter[244414]: ERROR 10:00:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:00:56 localhost openstack_network_exporter[244414]: Feb 20 05:00:56 localhost sshd[322163]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:00:57 localhost nova_compute[281288]: 2026-02-20 10:00:57.277 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:00:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e241 e241: 6 total, 6 up, 6 in Feb 20 05:00:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 05:00:59 localhost ceph-mon[301857]: from='mgr.44375 
172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:00:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:00:59 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e242 e242: 6 total, 6 up, 6 in Feb 20 05:00:59 localhost nova_compute[281288]: 2026-02-20 10:00:59.738 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 05:01:00 localhost podman[322165]: 2026-02-20 10:01:00.15337282 +0000 UTC m=+0.081612558 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 05:01:00 localhost podman[322165]: 2026-02-20 10:01:00.166427408 +0000 UTC m=+0.094667206 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 05:01:00 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 05:01:00 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e243 e243: 6 total, 6 up, 6 in Feb 20 05:01:00 localhost nova_compute[281288]: 2026-02-20 10:01:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 05:01:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1908739206' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 05:01:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 05:01:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1908739206' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.279 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.281 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.316 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.317 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:02 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 20 05:01:02 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 20 05:01:02 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": 
"auth rm", "entity": "client.alice bob"}]': finished Feb 20 05:01:02 localhost ovn_metadata_agent[162647]: 2026-02-20 10:01:02.490 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.492 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:02 localhost ovn_metadata_agent[162647]: 2026-02-20 10:01:02.493 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.745 281292 DEBUG oslo_concurrency.lockutils [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.746 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 05:01:02 localhost nova_compute[281288]: 2026-02-20 10:01:02.746 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:01:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e244 e244: 6 total, 6 up, 6 in Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. 
Immutable memtables: 0. Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.103314) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43 Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663103411, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 466, "num_deletes": 250, "total_data_size": 413035, "memory_usage": 422512, "flush_reason": "Manual Compaction"} Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663107976, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 270910, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28805, "largest_seqno": 29265, "table_properties": {"data_size": 268258, "index_size": 699, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7400, "raw_average_key_size": 21, "raw_value_size": 262609, "raw_average_value_size": 746, "num_data_blocks": 31, "num_entries": 352, "num_filter_entries": 352, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581653, "oldest_key_time": 1771581653, "file_creation_time": 1771581663, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}} Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4696 microseconds, and 1741 cpu microseconds. Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.108024) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 270910 bytes OK Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.108047) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.109570) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.109590) EVENT_LOG_v1 {"time_micros": 1771581663109583, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.109615) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 410099, prev total 
WAL file size 410423, number of live WAL files 2. Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.111326) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303036' seq:72057594037927935, type:22 .. '6D6772737461740034323537' seq:0, type:0; will stop at (end) Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(264KB)], [42(18MB)] Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663111402, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 20164873, "oldest_snapshot_seqno": -1} Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 13703 keys, 18046533 bytes, temperature: kUnknown Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663167157, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 18046533, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17968485, "index_size": 42683, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34309, "raw_key_size": 366697, "raw_average_key_size": 26, 
"raw_value_size": 17735848, "raw_average_value_size": 1294, "num_data_blocks": 1601, "num_entries": 13703, "num_filter_entries": 13703, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581663, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}} Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.167414) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 18046533 bytes Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.168597) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 360.9 rd, 323.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 19.0 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(141.0) write-amplify(66.6) OK, records in: 14226, records dropped: 523 output_compression: NoCompression Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.168612) EVENT_LOG_v1 {"time_micros": 1771581663168605, "job": 24, "event": "compaction_finished", "compaction_time_micros": 55872, "compaction_time_cpu_micros": 25635, "output_level": 6, "num_output_files": 1, "total_output_size": 18046533, "num_input_records": 14226, "num_output_records": 13703, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663168750, "job": 24, "event": "table_file_deletion", "file_number": 44} Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663170223, 
"job": 24, "event": "table_file_deletion", "file_number": 42} Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.111208) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:01:03 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:01:03 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:01:03 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/884298509' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.257 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.308 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.308 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.511 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.512 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11235MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.512 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.513 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.569 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.570 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.570 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 05:01:03 localhost nova_compute[281288]: 2026-02-20 10:01:03.615 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:01:04 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:01:04 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2041852401' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:01:04 localhost nova_compute[281288]: 2026-02-20 10:01:04.123 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:01:04 localhost nova_compute[281288]: 2026-02-20 10:01:04.131 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 05:01:04 localhost nova_compute[281288]: 2026-02-20 10:01:04.148 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 05:01:04 localhost nova_compute[281288]: 2026-02-20 10:01:04.151 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 05:01:04 localhost nova_compute[281288]: 2026-02-20 10:01:04.151 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:01:05 localhost nova_compute[281288]: 2026-02-20 10:01:05.153 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:05 localhost nova_compute[281288]: 2026-02-20 10:01:05.153 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e245 e245: 6 total, 6 up, 6 in Feb 20 05:01:05 localhost sshd[322244]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:01:05 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 20 05:01:05 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:01:05 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:01:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:01:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:01:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:01:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:01:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:01:06.024 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 05:01:06 localhost podman[322247]: 2026-02-20 10:01:06.135101219 +0000 UTC m=+0.073239912 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 05:01:06 localhost podman[322247]: 2026-02-20 10:01:06.14495459 +0000 UTC m=+0.083093313 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 05:01:06 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 05:01:06 localhost podman[322246]: 2026-02-20 10:01:06.21091084 +0000 UTC m=+0.150706603 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1770267347, container_name=openstack_network_exporter, version=9.7, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible) Feb 20 05:01:06 localhost podman[322246]: 2026-02-20 10:01:06.227271307 +0000 UTC m=+0.167067080 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 20 05:01:06 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 05:01:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:06 localhost nova_compute[281288]: 2026-02-20 10:01:06.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e246 e246: 6 total, 6 up, 6 in Feb 20 05:01:07 localhost nova_compute[281288]: 2026-02-20 10:01:07.346 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:07 localhost nova_compute[281288]: 2026-02-20 10:01:07.349 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:07 localhost nova_compute[281288]: 2026-02-20 10:01:07.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:07 localhost nova_compute[281288]: 2026-02-20 10:01:07.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 05:01:07 localhost nova_compute[281288]: 2026-02-20 10:01:07.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 05:01:07 localhost 
nova_compute[281288]: 2026-02-20 10:01:07.809 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 05:01:07 localhost nova_compute[281288]: 2026-02-20 10:01:07.810 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 05:01:07 localhost nova_compute[281288]: 2026-02-20 10:01:07.810 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 05:01:07 localhost nova_compute[281288]: 2026-02-20 10:01:07.810 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 05:01:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e247 e247: 6 total, 6 up, 6 in Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.302 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.329 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.330 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.330 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.331 281292 DEBUG nova.compute.manager [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.331 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.739 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.739 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 20 05:01:08 localhost nova_compute[281288]: 2026-02-20 10:01:08.757 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 20 05:01:09 localhost sshd[322289]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:01:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:11 localhost ovn_metadata_agent[162647]: 2026-02-20 10:01:11.495 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', 
{'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 05:01:12 localhost nova_compute[281288]: 2026-02-20 10:01:12.393 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:12 localhost nova_compute[281288]: 2026-02-20 10:01:12.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:12 localhost nova_compute[281288]: 2026-02-20 10:01:12.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:12 localhost nova_compute[281288]: 2026-02-20 10:01:12.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:12 localhost nova_compute[281288]: 2026-02-20 10:01:12.396 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:12 localhost nova_compute[281288]: 2026-02-20 10:01:12.397 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:12 localhost sshd[322291]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:01:13 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e248 e248: 6 total, 6 up, 6 in Feb 20 05:01:13 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 20 05:01:13 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb,allow rw path=/volumes/_nogroup/8be201ef-8dd5-4872-91e4-0290b94f4b6d/d3136894-05be-4407-a87e-f697b6b3efd1", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5,allow rw pool=manila_data namespace=fsvolumens_8be201ef-8dd5-4872-91e4-0290b94f4b6d"]} : dispatch Feb 20 05:01:13 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb,allow rw path=/volumes/_nogroup/8be201ef-8dd5-4872-91e4-0290b94f4b6d/d3136894-05be-4407-a87e-f697b6b3efd1", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5,allow rw pool=manila_data namespace=fsvolumens_8be201ef-8dd5-4872-91e4-0290b94f4b6d"]}]': finished Feb 20 05:01:13 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 20 05:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 05:01:14 localhost podman[322293]: 2026-02-20 10:01:14.37514635 +0000 UTC m=+0.090383504 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Feb 20 05:01:14 localhost 
podman[322293]: 2026-02-20 10:01:14.393021784 +0000 UTC m=+0.108258958 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 20 05:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:01:14 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e249 e249: 6 total, 6 up, 6 in Feb 20 05:01:14 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 05:01:14 localhost podman[322311]: 2026-02-20 10:01:14.489925657 +0000 UTC m=+0.083995360 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Feb 20 05:01:14 localhost podman[322311]: 2026-02-20 10:01:14.560042464 +0000 UTC m=+0.154112227 container exec_died 
67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Feb 20 05:01:14 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 05:01:15 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e250 e250: 6 total, 6 up, 6 in Feb 20 05:01:16 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 20 05:01:16 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5"]} : dispatch Feb 20 05:01:16 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5"]}]': finished Feb 20 05:01:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:17 localhost nova_compute[281288]: 2026-02-20 10:01:17.398 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:17 localhost nova_compute[281288]: 2026-02-20 10:01:17.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:17 localhost nova_compute[281288]: 2026-02-20 10:01:17.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:17 localhost 
nova_compute[281288]: 2026-02-20 10:01:17.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:17 localhost nova_compute[281288]: 2026-02-20 10:01:17.417 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:17 localhost nova_compute[281288]: 2026-02-20 10:01:17.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:17 localhost podman[241968]: time="2026-02-20T10:01:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:01:17 localhost podman[241968]: @ - - [20/Feb/2026:10:01:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:01:17 localhost podman[241968]: @ - - [20/Feb/2026:10:01:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18358 "" "Go-http-client/1.1" Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.321 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': 
{}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc0f878e-2266-47f8-b48a-659ec4f14efc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.322275', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19c705d0-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '3568909fd73a0d0e57e5812a6e4f8bc9dc3a3776ec3e0fc29e2553d4c80c0331'}]}, 'timestamp': '2026-02-20 10:01:18.327044', '_unique_id': 'c9aa78174bdc4a6e9cc5dddffd597d15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '25578c2c-45ed-4df3-b693-36ddce741b38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.330217', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19c79806-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '089e6df68f40619e40920f696b7fb41614e3cbf54edbdd8cb4b6ce26a27dce1d'}]}, 'timestamp': '2026-02-20 10:01:18.330772', '_unique_id': 'f83e8445e94e49b39f2c805afdcc6de3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.360 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5d7a547-1189-479f-8fa8-28b4a188eb7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.333003', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19cc3884-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': 'c0536afcfac39382aecad9fc7dabe368aa66a9024d51d5c3bbdd20153d1eae59'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.333003', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19cc4eb4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': 'b5a262126a489fc8db61acaf40f0593491e62e5c4222c5545f30a99549f987fb'}]}, 'timestamp': '2026-02-20 10:01:18.361590', '_unique_id': 'e951c0696c05409ab628459ccbc76d02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.364 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b6d72a56-af11-4579-88f2-5788aa4307e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.364392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19ccd0e6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '87315e7022be5ec22d4a966aa4147a343a36c7f1fdcdd265ee53d22c2dd84992'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.364392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19cce252-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '24254e51b70c7e7d49c79eb08195e60dd0882c91f7114c4f93e6607cfaab5828'}]}, 'timestamp': '2026-02-20 10:01:18.365397', '_unique_id': 'dd066772099241fa98ece0de0a00cd17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.367 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.367 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.368 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4c42d48-e74d-4565-a66f-f83ccb4e3f61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.367701', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19cd4ff8-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '076c7f73906380475a7b4ebe24ef1eeeadcebce1ae7b95c1c1b07733c19bb8de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.367701', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19cd607e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '66751623b0588b8c3aff6df1b25fb6864fedebc95c8e5733b9719c140bf42648'}]}, 'timestamp': '2026-02-20 10:01:18.368682', '_unique_id': '63da96fd5e834f8abeb82936f6d3074c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.370 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20f357e3-a487-41a5-988a-e2f22faa529b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.370933', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19cdce88-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '9cc17a93d06fef89893835990cf1d971bc6c6a7546ea7c32d54bab23dc8a2c3e'}]}, 'timestamp': '2026-02-20 10:01:18.371426', '_unique_id': 'a0d548b93f43414cacc0e4649cf94b00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.373 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.373 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e583d93-ecc0-4b3e-90b9-a781e5b1a57b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.373785', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19ce3f8a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '71cb6bd37cda3c1aa8c1c4b9b95e5af73e64110d77f5e2638759d54f33c2e520'}]}, 'timestamp': '2026-02-20 10:01:18.374325', '_unique_id': '7016c2d6f3c44bd2be60f74fa6993604'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20
05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.376 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.376 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '777ae504-7d3e-4530-832b-e3bf56f62268', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.376874', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19ceb690-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': 'd990bfe87c888eddbec86e1f2aee5198882b83a168ae1adecdf05735f6859ca1'}]}, 'timestamp': '2026-02-20 10:01:18.377415', '_unique_id': '0aac0b11e44f4912853d41e400a9f7cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.379 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.379 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.380 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '728e4541-d2c3-4cd4-9ab7-5efd98e45b3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.379884', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19cf2c06-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '30a9b0029950a77850f9d7e26fd22f0f299c0b103828a15ce689653b7bf092cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.379884', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19cf3cc8-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': 'bb0bf2f40c1fded486f98d76c81879fba81de77c8502cb99e848d1d4c5238153'}]}, 'timestamp': '2026-02-20 10:01:18.380827', '_unique_id': '1ea80a1688e046c5b452206fdc40d586'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:01:18.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.383 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '974d112f-b86a-4c33-9152-dc5ad90bfa24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.383038', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19cfa67c-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '4e783cfe5259bfa2effb8fc868ada59038f423faa59577d7c2a35574ee2e3fdc'}]}, 'timestamp': '2026-02-20 10:01:18.383506', '_unique_id': '03acec0b5a6a4fbebf2f74b2d913f87f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.385 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.395 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.396 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'be672f60-7189-4021-a2c9-ff510744b02f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.385657', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d19a18-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': 'ddc71601dbab6a5b5ffeff7c057d5bc86e30ccda093cb6981e0106147525ca96'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.385657', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d1ab48-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '5f41f31d8df78df448604c95d817b40fb4617e54f90c1c38314c71447b69c9fa'}]}, 'timestamp': '2026-02-20 10:01:18.396845', '_unique_id': 'a9f93066114045eaa5f99cb48e041027'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:01:18.397 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:01:18.397 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.399 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a7af843c-a626-4541-8522-7640854138e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.399756', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19d236e4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': 'a6ea6e53ce71a145e82cedcc098e88d012fba96d8e8a1a717869cdb3c904dd5a'}]}, 'timestamp': '2026-02-20 10:01:18.400392', '_unique_id': 'd1419ff9453f45e3a17995eec0da61ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.402 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.403 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a25ee31-3f7f-403b-a4d9-fefab8ab5af2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.402892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d2adf4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': 'e6224391fac70279059422152e04effa36950e2dd3442d018d3340a57d1638cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.402892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d2bde4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '05c082d8b153b9e6756de67cb9cce889c9921fb33a0f0eadaed57043d2ab35c6'}]}, 'timestamp': '2026-02-20 10:01:18.403766', '_unique_id': '698f7511a5df46d99d29421dcfbd6723'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.406 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.406 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c17d3bd3-8d42-42fe-8b9d-9dfcf57e2aa2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.406030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d32892-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '70f136de0b70bbcc921fd367d466f24e98abd5d1c6631d02410d95295d971a53'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.406030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d3399a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '3822c1e081ad6128a972a423616f2ef054167b146ab39c9b9043314ef15480e2'}]}, 'timestamp': '2026-02-20 10:01:18.406903', '_unique_id': 'c4c331f009834f17be17b2f18bb928d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.408 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.423 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 19490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b48210cc-0f0b-4560-bda3-c0447455412a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19490000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:01:18.409011', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '19d5d7cc-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.662670307, 'message_signature': '32543b785726a3389c018482c3ce5129a68d625593a29b079f39b1d779f1f112'}]}, 'timestamp': '2026-02-20 10:01:18.424080', '_unique_id': '2a18c618657d465095bb199bfc2937c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:01:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 
05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.426 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.426 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.426 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '35dcce18-1e32-4140-a9fc-9b1fc5768c01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:01:18.426326', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '19d64126-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.662670307, 'message_signature': 'd32d41bf37fcbcf0dd4787fb142f2a96a46d6dbf141689ad59111726b4c157e7'}]}, 'timestamp': '2026-02-20 10:01:18.426804', '_unique_id': '6de8f8a3cec745b089055341735e193e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 20 05:01:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.428 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd47b0cd7-c847-4c8f-bc3e-1a452d5ae3c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.428880', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19d6a562-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 
'message_signature': '84ab443d8540a7e261a9c62893a36baf5ec09535f960d27f00a4171e4a3fc1b6'}]}, 'timestamp': '2026-02-20 10:01:18.429349', '_unique_id': '040a6fe23c3d45c9a03bb8659a47d22c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.431 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '090f1568-9ddd-423a-a585-fb489adc93a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.431508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d70cbe-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '5de362648e3bcc4f9b8ac5da2ec3194e0871817d8d42f0b03da69da925d31b42'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.431508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d71cc2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '3c24bcd018c4fa4afaae5b971892a9b7f0bc286d101917fee874af7da7c63c26'}]}, 'timestamp': '2026-02-20 10:01:18.432378', '_unique_id': '9acc2c78cd4f4f799395ab35ca392188'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:01:18.433 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:01:18.433 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.434 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.434 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:01:18.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f58ec99-d937-4aaf-b1d7-308b521d1626', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.434615', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19d786bc-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': 'ca801fc0cb8c423f9cc400f4137200bfb9f71a1eb27f9a119b766508e7697776'}]}, 'timestamp': '2026-02-20 10:01:18.435115', '_unique_id': 'e325c7b43af04b18b3634f0a12854be5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.437 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.437 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.438 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ff97e4ba-e6e3-4957-9072-4e79bf9b8caa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.437761', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d80024-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '82c9301498c945c5b1147bb86fd1bd7c7192bff63b85887cbe55088a4c686d70'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.437761', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d8100a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '40bc3c16fa0d01b9daabd1fb5255f964625e0b95d26ccaeead6e13fcbf735dea'}]}, 'timestamp': '2026-02-20 10:01:18.438605', '_unique_id': '1d6c943342524338843b1cb2418c5098'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.440 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.440 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3c88ca35-d4a4-475d-8e88-5915a563825b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.440737', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19d8748c-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '31c09eed5822bf1dd51e3cd7f18a09c807e06d0a888009dbf8e87a3487f468d4'}]}, 'timestamp': '2026-02-20 10:01:18.441204', '_unique_id': '8606542adc944842bc58f0e9cd235848'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:01:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:01:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging Feb 20 05:01:20 localhost ceph-mon[301857]: 
from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 20 05:01:20 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Feb 20 05:01:20 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Feb 20 05:01:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:22 localhost nova_compute[281288]: 2026-02-20 10:01:22.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:22 localhost nova_compute[281288]: 2026-02-20 10:01:22.420 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:22 localhost nova_compute[281288]: 2026-02-20 10:01:22.421 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:22 localhost nova_compute[281288]: 2026-02-20 10:01:22.421 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:22 localhost nova_compute[281288]: 2026-02-20 10:01:22.448 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:22 localhost nova_compute[281288]: 2026-02-20 10:01:22.449 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 05:01:23 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e251 e251: 6 total, 6 up, 6 in Feb 20 05:01:23 localhost podman[322338]: 2026-02-20 10:01:23.129872215 +0000 UTC m=+0.070116271 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 20 05:01:23 localhost podman[322338]: 2026-02-20 10:01:23.146313146 +0000 UTC m=+0.086557242 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:01:23 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:01:23 localhost ovn_controller[156798]: 2026-02-20T10:01:23Z|00407|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Feb 20 05:01:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:26 localhost openstack_network_exporter[244414]: ERROR 10:01:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:01:26 localhost openstack_network_exporter[244414]: Feb 20 05:01:26 localhost openstack_network_exporter[244414]: ERROR 10:01:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:01:26 localhost openstack_network_exporter[244414]: Feb 20 05:01:27 localhost nova_compute[281288]: 2026-02-20 10:01:27.483 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:27 localhost nova_compute[281288]: 2026-02-20 10:01:27.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:27 localhost nova_compute[281288]: 2026-02-20 10:01:27.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:27 localhost 
nova_compute[281288]: 2026-02-20 10:01:27.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:27 localhost nova_compute[281288]: 2026-02-20 10:01:27.485 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:27 localhost nova_compute[281288]: 2026-02-20 10:01:27.488 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 05:01:30 localhost podman[322358]: 2026-02-20 10:01:30.87147616 +0000 UTC m=+0.087192482 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 05:01:30 localhost podman[322358]: 2026-02-20 10:01:30.909071426 +0000 UTC m=+0.124787748 container exec_died 
010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 05:01:30 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. 
Feb 20 05:01:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:32 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e252 e252: 6 total, 6 up, 6 in Feb 20 05:01:32 localhost nova_compute[281288]: 2026-02-20 10:01:32.487 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:33 localhost sshd[322382]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:01:35 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:01:35 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/3176bf32-687c-42ee-9751-803dd8a1fea3/40c90f7c-2187-4f07-808e-7e98f84f2bd9", "osd", "allow rw pool=manila_data namespace=fsvolumens_3176bf32-687c-42ee-9751-803dd8a1fea3", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:01:35 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/3176bf32-687c-42ee-9751-803dd8a1fea3/40c90f7c-2187-4f07-808e-7e98f84f2bd9", "osd", "allow rw pool=manila_data namespace=fsvolumens_3176bf32-687c-42ee-9751-803dd8a1fea3", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:01:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:37 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:01:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 05:01:37 localhost podman[322385]: 2026-02-20 10:01:37.126979572 +0000 UTC m=+0.059301071 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 05:01:37 localhost podman[322384]: 2026-02-20 10:01:37.203213498 +0000 UTC m=+0.137156426 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-type=git, release=1770267347, version=9.7, architecture=x86_64, container_name=openstack_network_exporter, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 20 05:01:37 localhost podman[322384]: 2026-02-20 10:01:37.213367527 +0000 UTC m=+0.147310475 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, version=9.7, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal) Feb 20 05:01:37 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 05:01:37 localhost podman[322385]: 2026-02-20 10:01:37.27440877 +0000 UTC m=+0.206730279 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 05:01:37 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 05:01:37 localhost nova_compute[281288]: 2026-02-20 10:01:37.491 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:37 localhost nova_compute[281288]: 2026-02-20 10:01:37.493 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:37 localhost nova_compute[281288]: 2026-02-20 10:01:37.493 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:37 localhost nova_compute[281288]: 2026-02-20 10:01:37.493 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:37 localhost nova_compute[281288]: 2026-02-20 10:01:37.539 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:37 localhost nova_compute[281288]: 2026-02-20 10:01:37.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:38 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e253 e253: 6 total, 6 up, 6 in Feb 20 05:01:39 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e254 e254: 6 total, 6 up, 6 in Feb 20 05:01:39 localhost ceph-mon[301857]: from='mgr.44375 
172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:01:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch Feb 20 05:01:39 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished Feb 20 05:01:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:42 localhost nova_compute[281288]: 2026-02-20 10:01:42.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:42 localhost nova_compute[281288]: 2026-02-20 10:01:42.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:42 localhost nova_compute[281288]: 2026-02-20 10:01:42.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:42 localhost nova_compute[281288]: 2026-02-20 10:01:42.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:42 localhost nova_compute[281288]: 2026-02-20 10:01:42.580 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:42 localhost nova_compute[281288]: 2026-02-20 10:01:42.581 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:43 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 e255: 6 total, 6 up, 6 in Feb 20 05:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 05:01:45 localhost podman[322428]: 2026-02-20 10:01:45.155498411 +0000 UTC m=+0.086432587 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:01:45 localhost systemd[1]: tmp-crun.zXNtak.mount: Deactivated successfully. Feb 20 05:01:45 localhost podman[322429]: 2026-02-20 10:01:45.212065398 +0000 UTC m=+0.138104905 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3) Feb 20 05:01:45 localhost podman[322428]: 2026-02-20 10:01:45.224075284 +0000 UTC m=+0.155009500 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 05:01:45 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 05:01:45 localhost podman[322429]: 2026-02-20 10:01:45.245181578 +0000 UTC m=+0.171221065 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:01:45 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 05:01:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:01:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/2e325a34-23a3-4cc0-9c2e-4df32d08d36f/82f72010-739f-40e2-832b-6d6c7666a8fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:01:45 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/2e325a34-23a3-4cc0-9c2e-4df32d08d36f/82f72010-739f-40e2-832b-6d6c7666a8fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:01:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:47 localhost nova_compute[281288]: 2026-02-20 10:01:47.582 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:47 localhost nova_compute[281288]: 2026-02-20 10:01:47.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:47 localhost nova_compute[281288]: 2026-02-20 10:01:47.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:47 localhost nova_compute[281288]: 2026-02-20 10:01:47.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:47 localhost nova_compute[281288]: 2026-02-20 10:01:47.630 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:47 localhost nova_compute[281288]: 2026-02-20 10:01:47.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:47 localhost podman[241968]: time="2026-02-20T10:01:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:01:47 localhost podman[241968]: @ - - [20/Feb/2026:10:01:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:01:47 localhost podman[241968]: @ - - [20/Feb/2026:10:01:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18352 "" "Go-http-client/1.1" Feb 20 05:01:50 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 05:01:50 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:01:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:52 localhost sshd[322555]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:01:52 localhost nova_compute[281288]: 2026-02-20 
10:01:52.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:52 localhost nova_compute[281288]: 2026-02-20 10:01:52.659 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:01:52 localhost nova_compute[281288]: 2026-02-20 10:01:52.659 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:01:52 localhost nova_compute[281288]: 2026-02-20 10:01:52.659 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:52 localhost nova_compute[281288]: 2026-02-20 10:01:52.659 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:52 localhost nova_compute[281288]: 2026-02-20 10:01:52.660 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:01:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 05:01:53 localhost podman[322557]: 2026-02-20 10:01:53.297282666 +0000 UTC m=+0.070150971 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2) Feb 20 05:01:53 localhost podman[322557]: 2026-02-20 10:01:53.3370743 +0000 UTC m=+0.109942565 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0) Feb 20 05:01:53 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:01:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:01:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:01:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch Feb 20 05:01:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished Feb 20 05:01:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:01:56 localhost openstack_network_exporter[244414]: ERROR 10:01:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:01:56 localhost openstack_network_exporter[244414]: Feb 20 05:01:56 localhost openstack_network_exporter[244414]: ERROR 10:01:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:01:56 localhost openstack_network_exporter[244414]: Feb 20 05:01:57 localhost sshd[322576]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:01:57 localhost nova_compute[281288]: 2026-02-20 10:01:57.661 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:57 localhost nova_compute[281288]: 2026-02-20 10:01:57.664 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:01:58 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1559371662", "format": "json"} : dispatch Feb 20 05:01:58 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1559371662", "caps": ["mds", "allow rw path=/volumes/_nogroup/50918518-d5fc-4598-a9e4-c7aeadda4e5c/790d1933-deae-4553-8b28-0d1b6f0fc428", "osd", "allow rw pool=manila_data namespace=fsvolumens_50918518-d5fc-4598-a9e4-c7aeadda4e5c", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:01:58 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1559371662", "caps": ["mds", "allow rw path=/volumes/_nogroup/50918518-d5fc-4598-a9e4-c7aeadda4e5c/790d1933-deae-4553-8b28-0d1b6f0fc428", "osd", "allow rw pool=manila_data namespace=fsvolumens_50918518-d5fc-4598-a9e4-c7aeadda4e5c", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:01:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1559371662", "format": "json"} : dispatch Feb 20 05:01:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1559371662"} : dispatch Feb 20 05:01:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1559371662"}]': finished Feb 20 05:02:00 localhost nova_compute[281288]: 2026-02-20 10:02:00.739 281292 DEBUG oslo_service.periodic_task [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 05:02:01 localhost systemd[1]: tmp-crun.IoefkG.mount: Deactivated successfully. Feb 20 05:02:01 localhost podman[322578]: 2026-02-20 10:02:01.146210005 +0000 UTC m=+0.085438138 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 05:02:01 localhost podman[322578]: 2026-02-20 10:02:01.158813459 +0000 UTC m=+0.098041572 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 05:02:01 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 05:02:01 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:01 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/b164674c-a82b-4878-a588-09120b66d1e5/72089254-fcf7-474f-aaf6-ba53e49ce9b2", "osd", "allow rw pool=manila_data namespace=fsvolumens_b164674c-a82b-4878-a588-09120b66d1e5", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:02:01 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/b164674c-a82b-4878-a588-09120b66d1e5/72089254-fcf7-474f-aaf6-ba53e49ce9b2", "osd", "allow rw pool=manila_data namespace=fsvolumens_b164674c-a82b-4878-a588-09120b66d1e5", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:02:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 05:02:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/644961451' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 05:02:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 05:02:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/644961451' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 05:02:02 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:02.572 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:02:02 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:02.573 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 05:02:02 localhost nova_compute[281288]: 2026-02-20 10:02:02.607 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:02 
localhost nova_compute[281288]: 2026-02-20 10:02:02.662 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:02 localhost nova_compute[281288]: 2026-02-20 10:02:02.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:02 localhost nova_compute[281288]: 2026-02-20 10:02:02.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:04 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:04 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch Feb 20 05:02:04 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished Feb 20 05:02:04 localhost nova_compute[281288]: 2026-02-20 10:02:04.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:04 localhost nova_compute[281288]: 2026-02-20 10:02:04.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:04 localhost nova_compute[281288]: 2026-02-20 10:02:04.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:04 localhost nova_compute[281288]: 2026-02-20 10:02:04.767 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:02:04 localhost nova_compute[281288]: 2026-02-20 10:02:04.767 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:02:04 localhost nova_compute[281288]: 2026-02-20 10:02:04.768 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:02:04 localhost nova_compute[281288]: 2026-02-20 10:02:04.768 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 05:02:04 localhost nova_compute[281288]: 2026-02-20 10:02:04.769 281292 DEBUG 
oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:02:05 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:02:05 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2750696122' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.218 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.280 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.281 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.479 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.481 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11230MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.481 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.481 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.708 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.708 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.709 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.797 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.933 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 20 
05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.933 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.953 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 20 05:02:05 localhost nova_compute[281288]: 2026-02-20 10:02:05.973 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: 
HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 20 05:02:06 localhost nova_compute[281288]: 2026-02-20 10:02:06.008 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:02:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:06.024 162652 DEBUG oslo_concurrency.lockutils [-] 
Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:02:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:06.024 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:02:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:06.025 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:02:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:02:06 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/2507481714' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:02:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:06 localhost nova_compute[281288]: 2026-02-20 10:02:06.490 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:02:06 localhost nova_compute[281288]: 2026-02-20 10:02:06.497 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 05:02:06 localhost nova_compute[281288]: 2026-02-20 10:02:06.511 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 05:02:06 localhost nova_compute[281288]: 2026-02-20 10:02:06.514 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain 
_update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 05:02:06 localhost nova_compute[281288]: 2026-02-20 10:02:06.515 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:02:07 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:07 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:02:07 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:02:07 localhost nova_compute[281288]: 2026-02-20 10:02:07.515 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 
05:02:07 localhost nova_compute[281288]: 2026-02-20 10:02:07.516 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:07 localhost nova_compute[281288]: 2026-02-20 10:02:07.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:07 localhost nova_compute[281288]: 2026-02-20 10:02:07.714 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:07 localhost nova_compute[281288]: 2026-02-20 10:02:07.714 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5046 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:02:07 localhost nova_compute[281288]: 2026-02-20 10:02:07.715 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:07 localhost nova_compute[281288]: 2026-02-20 10:02:07.716 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:07 localhost nova_compute[281288]: 2026-02-20 10:02:07.717 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:02:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 05:02:08 localhost podman[322646]: 2026-02-20 10:02:08.142920602 +0000 UTC m=+0.079944399 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, architecture=x86_64, release=1770267347, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7) Feb 20 05:02:08 localhost podman[322646]: 2026-02-20 10:02:08.156760324 +0000 UTC m=+0.093784141 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vcs-type=git, managed_by=edpm_ansible, version=9.7, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 20 05:02:08 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 05:02:08 localhost podman[322647]: 2026-02-20 10:02:08.205196202 +0000 UTC m=+0.136243737 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 05:02:08 localhost podman[322647]: 2026-02-20 10:02:08.216152296 +0000 UTC m=+0.147199881 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 05:02:08 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 05:02:08 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:08.576 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 05:02:08 localhost nova_compute[281288]: 2026-02-20 10:02:08.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:08 localhost nova_compute[281288]: 2026-02-20 10:02:08.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 05:02:09 localhost nova_compute[281288]: 2026-02-20 10:02:09.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:09 localhost nova_compute[281288]: 2026-02-20 10:02:09.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 05:02:09 localhost nova_compute[281288]: 2026-02-20 10:02:09.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 05:02:10 localhost nova_compute[281288]: 2026-02-20 10:02:10.047 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 05:02:10 localhost nova_compute[281288]: 2026-02-20 10:02:10.047 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 05:02:10 localhost nova_compute[281288]: 2026-02-20 10:02:10.048 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 05:02:10 localhost nova_compute[281288]: 2026-02-20 10:02:10.048 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 05:02:11 localhost nova_compute[281288]: 2026-02-20 10:02:11.406 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 05:02:11 localhost nova_compute[281288]: 2026-02-20 10:02:11.461 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 05:02:11 localhost nova_compute[281288]: 2026-02-20 10:02:11.462 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 05:02:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:12 localhost nova_compute[281288]: 2026-02-20 10:02:12.458 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:02:12 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:12 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch Feb 20 05:02:12 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished Feb 20 05:02:12 localhost nova_compute[281288]: 2026-02-20 10:02:12.717 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:12 localhost nova_compute[281288]: 2026-02-20 10:02:12.719 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:15 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:15 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:02:15 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": 
"client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 05:02:16 localhost podman[322690]: 2026-02-20 10:02:16.13766706 +0000 UTC m=+0.078406963 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:02:16 localhost podman[322691]: 2026-02-20 10:02:16.184372856 +0000 UTC m=+0.123885942 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:02:16 localhost podman[322690]: 2026-02-20 10:02:16.205146009 +0000 UTC m=+0.145885822 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller) Feb 20 05:02:16 localhost podman[322691]: 2026-02-20 10:02:16.215170515 +0000 UTC m=+0.154683591 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 20 05:02:16 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. Feb 20 05:02:16 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. 
Feb 20 05:02:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e256 e256: 6 total, 6 up, 6 in Feb 20 05:02:17 localhost podman[241968]: time="2026-02-20T10:02:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:02:17 localhost podman[241968]: @ - - [20/Feb/2026:10:02:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:02:17 localhost nova_compute[281288]: 2026-02-20 10:02:17.720 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:17 localhost nova_compute[281288]: 2026-02-20 10:02:17.721 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:17 localhost nova_compute[281288]: 2026-02-20 10:02:17.722 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:02:17 localhost nova_compute[281288]: 2026-02-20 10:02:17.722 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:17 localhost nova_compute[281288]: 2026-02-20 10:02:17.722 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:17 localhost nova_compute[281288]: 2026-02-20 10:02:17.723 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:17 localhost nova_compute[281288]: 2026-02-20 10:02:17.724 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:17 localhost podman[241968]: @ - - [20/Feb/2026:10:02:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18356 "" "Go-http-client/1.1" Feb 20 05:02:18 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:18 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch Feb 20 05:02:18 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished Feb 20 05:02:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:22 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:22 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:02:22 
localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:02:22 localhost nova_compute[281288]: 2026-02-20 10:02:22.726 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:22 localhost nova_compute[281288]: 2026-02-20 10:02:22.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:22 localhost nova_compute[281288]: 2026-02-20 10:02:22.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:02:22 localhost nova_compute[281288]: 2026-02-20 10:02:22.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:22 localhost nova_compute[281288]: 2026-02-20 10:02:22.746 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:22 localhost nova_compute[281288]: 2026-02-20 10:02:22.747 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:23 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 e257: 6 total, 6 up, 6 in Feb 20 05:02:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 05:02:24 localhost systemd[1]: tmp-crun.na0Jjr.mount: Deactivated successfully. Feb 20 05:02:24 localhost podman[322732]: 2026-02-20 10:02:24.166999814 +0000 UTC m=+0.104518410 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 20 05:02:24 localhost podman[322732]: 2026-02-20 10:02:24.204375934 +0000 UTC m=+0.141894510 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 05:02:24 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:02:24 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:02:24.497 264355 INFO neutron.agent.linux.ip_lib [None req-ba4ff21f-2e79-43e0-b9d1-7b69c8ff3641 - - - - - -] Device tap01c64444-62 cannot be used as it has no MAC address#033[00m Feb 20 05:02:24 localhost nova_compute[281288]: 2026-02-20 10:02:24.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:24 localhost kernel: device tap01c64444-62 entered promiscuous mode Feb 20 05:02:24 localhost ovn_controller[156798]: 2026-02-20T10:02:24Z|00408|binding|INFO|Claiming lport 01c64444-6242-477e-b8a5-acc9b61d7649 for this chassis. Feb 20 05:02:24 localhost ovn_controller[156798]: 2026-02-20T10:02:24Z|00409|binding|INFO|01c64444-6242-477e-b8a5-acc9b61d7649: Claiming unknown Feb 20 05:02:24 localhost nova_compute[281288]: 2026-02-20 10:02:24.582 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:24 localhost NetworkManager[5988]: [1771581744.5850] manager: (tap01c64444-62): new Generic device (/org/freedesktop/NetworkManager/Devices/63) Feb 20 05:02:24 localhost systemd-udevd[322761]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 05:02:24 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:24.596 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-0948d27a-4e54-4f2c-b484-b44317772f0a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0948d27a-4e54-4f2c-b484-b44317772f0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3d0b83eb9d040b2a1ee21f2d4ef3fce', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d227202d-dab9-4451-8037-c79279634b88, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=01c64444-6242-477e-b8a5-acc9b61d7649) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:02:24 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:24.599 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 01c64444-6242-477e-b8a5-acc9b61d7649 in datapath 0948d27a-4e54-4f2c-b484-b44317772f0a bound to our chassis#033[00m Feb 20 05:02:24 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:24.601 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port fd0005c6-52cb-4f7c-a677-21aa510edd4c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 05:02:24 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:24.602 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0948d27a-4e54-4f2c-b484-b44317772f0a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 05:02:24 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:24.603 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[8546ceba-4c20-4ad9-9bc9-020a6f6311f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 05:02:24 localhost journal[229984]: ethtool ioctl error on tap01c64444-62: No such device Feb 20 05:02:24 localhost journal[229984]: ethtool ioctl error on tap01c64444-62: No such device Feb 20 05:02:24 localhost ovn_controller[156798]: 2026-02-20T10:02:24Z|00410|binding|INFO|Setting lport 01c64444-6242-477e-b8a5-acc9b61d7649 ovn-installed in OVS Feb 20 05:02:24 localhost ovn_controller[156798]: 2026-02-20T10:02:24Z|00411|binding|INFO|Setting lport 01c64444-6242-477e-b8a5-acc9b61d7649 up in Southbound Feb 20 05:02:24 localhost journal[229984]: ethtool ioctl error on tap01c64444-62: No such device Feb 20 05:02:24 localhost nova_compute[281288]: 2026-02-20 10:02:24.628 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:24 localhost journal[229984]: ethtool ioctl error on tap01c64444-62: No such device Feb 20 05:02:24 localhost journal[229984]: ethtool ioctl error on tap01c64444-62: No such device Feb 20 05:02:24 localhost journal[229984]: ethtool ioctl error on tap01c64444-62: No such device Feb 20 05:02:24 localhost journal[229984]: ethtool ioctl error on tap01c64444-62: No such device Feb 20 05:02:24 localhost journal[229984]: ethtool ioctl error on tap01c64444-62: No such device Feb 
20 05:02:24 localhost nova_compute[281288]: 2026-02-20 10:02:24.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:24 localhost nova_compute[281288]: 2026-02-20 10:02:24.704 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:24 localhost nova_compute[281288]: 2026-02-20 10:02:24.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:25 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:25 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch Feb 20 05:02:25 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished Feb 20 05:02:25 localhost podman[322832]: Feb 20 05:02:25 localhost podman[322832]: 2026-02-20 10:02:25.727468705 +0000 UTC m=+0.094222155 container create 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:02:25 localhost 
podman[322832]: 2026-02-20 10:02:25.684739792 +0000 UTC m=+0.051493282 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 05:02:25 localhost systemd[1]: Started libpod-conmon-798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7.scope. Feb 20 05:02:25 localhost systemd[1]: tmp-crun.JMgZx7.mount: Deactivated successfully. Feb 20 05:02:25 localhost systemd[1]: Started libcrun container. Feb 20 05:02:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e93c2b209248131595b33d1ed560347197deb6725ee1319908263122cb78ab7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 05:02:25 localhost podman[322832]: 2026-02-20 10:02:25.830605892 +0000 UTC m=+0.197359342 container init 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 05:02:25 localhost podman[322832]: 2026-02-20 10:02:25.840558286 +0000 UTC m=+0.207311726 container start 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base 
Image) Feb 20 05:02:25 localhost dnsmasq[322851]: started, version 2.85 cachesize 150 Feb 20 05:02:25 localhost dnsmasq[322851]: DNS service limited to local subnets Feb 20 05:02:25 localhost dnsmasq[322851]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 05:02:25 localhost dnsmasq[322851]: warning: no upstream servers configured Feb 20 05:02:25 localhost dnsmasq-dhcp[322851]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 05:02:25 localhost dnsmasq[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/addn_hosts - 0 addresses Feb 20 05:02:25 localhost dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/host Feb 20 05:02:25 localhost dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/opts Feb 20 05:02:25 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:02:25.912 264355 INFO neutron.agent.dhcp.agent [None req-6592b2d0-af6e-446e-a0bc-e8e1c28584e1 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:02:24Z, description=, device_id=33ec31b2-fecf-477f-8148-61437b8399e4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ca945224-72bd-424a-80e3-db458bb34395, ip_allocation=immediate, mac_address=fa:16:3e:ce:45:e2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:02:21Z, description=, dns_domain=, id=0948d27a-4e54-4f2c-b484-b44317772f0a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-1085427905-network, port_security_enabled=True, project_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, 
provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8987, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3715, status=ACTIVE, subnets=['7bc845c7-cd31-4b75-b7fe-a9e74c7b267a'], tags=[], tenant_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, updated_at=2026-02-20T10:02:22Z, vlan_transparent=None, network_id=0948d27a-4e54-4f2c-b484-b44317772f0a, port_security_enabled=False, project_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3723, status=DOWN, tags=[], tenant_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, updated_at=2026-02-20T10:02:25Z on network 0948d27a-4e54-4f2c-b484-b44317772f0a#033[00m Feb 20 05:02:25 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:02:25.964 264355 INFO neutron.agent.dhcp.agent [None req-eea541f1-666d-447c-af2e-8cd526a60bd9 - - - - - -] DHCP configuration for ports {'acbf0195-d8e8-4b8e-88cb-5f48f4fbccad'} is completed#033[00m Feb 20 05:02:26 localhost dnsmasq[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/addn_hosts - 1 addresses Feb 20 05:02:26 localhost podman[322869]: 2026-02-20 10:02:26.134306198 +0000 UTC m=+0.061482437 container kill 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 20 05:02:26 localhost dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/host Feb 20 05:02:26 localhost dnsmasq-dhcp[322851]: 
read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/opts Feb 20 05:02:26 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:02:26.319 264355 INFO neutron.agent.dhcp.agent [None req-3e5402b6-cae1-4880-9ecd-45409f348abd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:02:24Z, description=, device_id=33ec31b2-fecf-477f-8148-61437b8399e4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ca945224-72bd-424a-80e3-db458bb34395, ip_allocation=immediate, mac_address=fa:16:3e:ce:45:e2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:02:21Z, description=, dns_domain=, id=0948d27a-4e54-4f2c-b484-b44317772f0a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-1085427905-network, port_security_enabled=True, project_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8987, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3715, status=ACTIVE, subnets=['7bc845c7-cd31-4b75-b7fe-a9e74c7b267a'], tags=[], tenant_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, updated_at=2026-02-20T10:02:22Z, vlan_transparent=None, network_id=0948d27a-4e54-4f2c-b484-b44317772f0a, port_security_enabled=False, project_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3723, status=DOWN, tags=[], tenant_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, updated_at=2026-02-20T10:02:25Z on network 0948d27a-4e54-4f2c-b484-b44317772f0a#033[00m Feb 20 05:02:26 localhost neutron_dhcp_agent[264351]: 2026-02-20 
10:02:26.397 264355 INFO neutron.agent.dhcp.agent [None req-c808fd63-e0d3-48ec-9c48-307b46aa9ff0 - - - - - -] DHCP configuration for ports {'ca945224-72bd-424a-80e3-db458bb34395'} is completed#033[00m Feb 20 05:02:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:26 localhost dnsmasq[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/addn_hosts - 1 addresses Feb 20 05:02:26 localhost dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/host Feb 20 05:02:26 localhost dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/opts Feb 20 05:02:26 localhost podman[322908]: 2026-02-20 10:02:26.546458464 +0000 UTC m=+0.065360716 container kill 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 20 05:02:26 localhost openstack_network_exporter[244414]: ERROR 10:02:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:02:26 localhost openstack_network_exporter[244414]: Feb 20 05:02:26 localhost openstack_network_exporter[244414]: ERROR 10:02:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:02:26 localhost openstack_network_exporter[244414]: Feb 20 05:02:26 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:02:26.842 264355 INFO neutron.agent.dhcp.agent [None 
req-77e057be-5cba-42f8-82f6-5b331cedcdb2 - - - - - -] DHCP configuration for ports {'ca945224-72bd-424a-80e3-db458bb34395'} is completed#033[00m Feb 20 05:02:27 localhost nova_compute[281288]: 2026-02-20 10:02:27.747 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:27 localhost nova_compute[281288]: 2026-02-20 10:02:27.750 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:29 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:29 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"} : dispatch Feb 20 05:02:29 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"}]': finished Feb 20 05:02:30 localhost sshd[322928]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:02:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 
318767104 Feb 20 05:02:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. Feb 20 05:02:32 localhost podman[322929]: 2026-02-20 10:02:32.152376797 +0000 UTC m=+0.085804520 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 05:02:32 localhost podman[322929]: 2026-02-20 10:02:32.165043472 +0000 UTC m=+0.098471226 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 05:02:32 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 05:02:32 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch Feb 20 05:02:32 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch Feb 20 05:02:32 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished Feb 20 05:02:32 localhost nova_compute[281288]: 2026-02-20 10:02:32.751 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:32 localhost nova_compute[281288]: 2026-02-20 10:02:32.754 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:32 localhost nova_compute[281288]: 2026-02-20 10:02:32.754 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:02:32 localhost nova_compute[281288]: 2026-02-20 10:02:32.754 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:32 localhost nova_compute[281288]: 2026-02-20 10:02:32.789 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:32 localhost nova_compute[281288]: 2026-02-20 10:02:32.790 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:34 localhost dnsmasq[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/addn_hosts - 0 addresses Feb 20 05:02:34 localhost dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/host Feb 20 05:02:34 localhost dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/opts Feb 20 05:02:34 localhost podman[322967]: 2026-02-20 10:02:34.145276862 +0000 UTC m=+0.064574961 container kill 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 20 05:02:34 localhost kernel: device tap01c64444-62 left promiscuous mode Feb 20 05:02:34 localhost nova_compute[281288]: 2026-02-20 10:02:34.371 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:34 localhost ovn_controller[156798]: 2026-02-20T10:02:34Z|00412|binding|INFO|Releasing lport 01c64444-6242-477e-b8a5-acc9b61d7649 from this chassis (sb_readonly=0) 
Feb 20 05:02:34 localhost ovn_controller[156798]: 2026-02-20T10:02:34Z|00413|binding|INFO|Setting lport 01c64444-6242-477e-b8a5-acc9b61d7649 down in Southbound Feb 20 05:02:34 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:34.380 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-0948d27a-4e54-4f2c-b484-b44317772f0a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0948d27a-4e54-4f2c-b484-b44317772f0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3d0b83eb9d040b2a1ee21f2d4ef3fce', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d227202d-dab9-4451-8037-c79279634b88, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=01c64444-6242-477e-b8a5-acc9b61d7649) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:02:34 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:34.381 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 01c64444-6242-477e-b8a5-acc9b61d7649 in datapath 0948d27a-4e54-4f2c-b484-b44317772f0a unbound from our chassis#033[00m Feb 20 05:02:34 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:34.384 162652 DEBUG 
neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0948d27a-4e54-4f2c-b484-b44317772f0a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 05:02:34 localhost ovn_metadata_agent[162647]: 2026-02-20 10:02:34.385 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[59e0f160-6a0e-4f67-98c6-92f5ed29b994]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 05:02:34 localhost nova_compute[281288]: 2026-02-20 10:02:34.393 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:34 localhost nova_compute[281288]: 2026-02-20 10:02:34.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:35 localhost ovn_controller[156798]: 2026-02-20T10:02:35Z|00414|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:02:35 localhost nova_compute[281288]: 2026-02-20 10:02:35.471 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:36 localhost dnsmasq[322851]: exiting on receipt of SIGTERM Feb 20 05:02:36 localhost podman[323007]: 2026-02-20 10:02:36.002410075 +0000 UTC m=+0.061172538 container kill 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 05:02:36 localhost systemd[1]: libpod-798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7.scope: Deactivated successfully. Feb 20 05:02:36 localhost podman[323020]: 2026-02-20 10:02:36.075256158 +0000 UTC m=+0.055573847 container died 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 05:02:36 localhost systemd[1]: tmp-crun.YKA1tR.mount: Deactivated successfully. Feb 20 05:02:36 localhost podman[323020]: 2026-02-20 10:02:36.118792476 +0000 UTC m=+0.099110125 container cleanup 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 05:02:36 localhost systemd[1]: libpod-conmon-798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7.scope: Deactivated successfully. 
Feb 20 05:02:36 localhost podman[323021]: 2026-02-20 10:02:36.196175607 +0000 UTC m=+0.171781092 container remove 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 05:02:36 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:02:36.228 264355 INFO neutron.agent.dhcp.agent [None req-892f4de4-d440-436e-8e71-26a59f2eae94 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:02:36 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:02:36.228 264355 INFO neutron.agent.dhcp.agent [None req-892f4de4-d440-436e-8e71-26a59f2eae94 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:02:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:37 localhost systemd[1]: var-lib-containers-storage-overlay-4e93c2b209248131595b33d1ed560347197deb6725ee1319908263122cb78ab7-merged.mount: Deactivated successfully. Feb 20 05:02:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7-userdata-shm.mount: Deactivated successfully. Feb 20 05:02:37 localhost systemd[1]: run-netns-qdhcp\x2d0948d27a\x2d4e54\x2d4f2c\x2db484\x2db44317772f0a.mount: Deactivated successfully. 
Feb 20 05:02:37 localhost nova_compute[281288]: 2026-02-20 10:02:37.824 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:02:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 05:02:39 localhost podman[323051]: 2026-02-20 10:02:39.161515573 +0000 UTC m=+0.096650690 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, version=9.7, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 05:02:39 localhost podman[323051]: 2026-02-20 10:02:39.203280447 +0000 UTC m=+0.138415614 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 20 05:02:39 localhost systemd[1]: tmp-crun.eUASep.mount: Deactivated successfully. Feb 20 05:02:39 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 05:02:39 localhost podman[323052]: 2026-02-20 10:02:39.226689621 +0000 UTC m=+0.158742004 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 05:02:39 localhost podman[323052]: 2026-02-20 10:02:39.26435275 +0000 UTC m=+0.196405193 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 05:02:39 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 05:02:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:42 localhost nova_compute[281288]: 2026-02-20 10:02:42.826 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:42 localhost nova_compute[281288]: 2026-02-20 10:02:42.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:42 localhost nova_compute[281288]: 2026-02-20 10:02:42.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:02:42 localhost nova_compute[281288]: 2026-02-20 10:02:42.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:42 localhost nova_compute[281288]: 2026-02-20 10:02:42.856 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:42 localhost nova_compute[281288]: 2026-02-20 10:02:42.857 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:02:43 localhost nova_compute[281288]: 2026-02-20 10:02:43.808 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:46 localhost nova_compute[281288]: 2026-02-20 10:02:46.631 281292 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:46 localhost sshd[323094]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:02:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:02:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 05:02:47 localhost podman[323096]: 2026-02-20 10:02:47.097307882 +0000 UTC m=+0.087591914 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:02:47 localhost podman[323097]: 2026-02-20 10:02:47.156018323 +0000 UTC m=+0.141783747 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Feb 20 05:02:47 localhost podman[323096]: 2026-02-20 10:02:47.172148045 +0000 UTC m=+0.162432107 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 20 05:02:47 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 05:02:47 localhost podman[323097]: 2026-02-20 10:02:47.19230869 +0000 UTC m=+0.178074114 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:02:47 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 05:02:47 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e258 e258: 6 total, 6 up, 6 in Feb 20 05:02:47 localhost podman[241968]: time="2026-02-20T10:02:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:02:47 localhost podman[241968]: @ - - [20/Feb/2026:10:02:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:02:47 localhost podman[241968]: @ - - [20/Feb/2026:10:02:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18353 "" "Go-http-client/1.1" Feb 20 05:02:47 localhost nova_compute[281288]: 2026-02-20 10:02:47.889 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:48 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e259 e259: 6 total, 6 up, 6 in Feb 20 05:02:48 localhost ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. 
Feb 20 05:02:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:51 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 05:02:51 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:02:52 localhost ovn_controller[156798]: 2026-02-20T10:02:52Z|00415|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:02:52 localhost nova_compute[281288]: 2026-02-20 10:02:52.226 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:52 localhost nova_compute[281288]: 2026-02-20 10:02:52.925 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:02:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e260 e260: 6 total, 6 up, 6 in Feb 20 05:02:54 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e261 e261: 6 total, 6 up, 6 in Feb 20 05:02:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:02:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 05:02:55 localhost podman[323225]: 2026-02-20 10:02:55.151846555 +0000 UTC m=+0.084702136 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:02:55 localhost podman[323225]: 2026-02-20 10:02:55.164924954 +0000 UTC m=+0.097780485 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 20 05:02:55 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:02:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e262 e262: 6 total, 6 up, 6 in Feb 20 05:02:56 localhost openstack_network_exporter[244414]: ERROR 10:02:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:02:56 localhost openstack_network_exporter[244414]: Feb 20 05:02:56 localhost openstack_network_exporter[244414]: ERROR 10:02:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:02:56 localhost openstack_network_exporter[244414]: Feb 20 05:02:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:02:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e263 e263: 6 total, 6 up, 6 in Feb 20 05:02:57 localhost nova_compute[281288]: 2026-02-20 10:02:57.927 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:02:58 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e264 e264: 6 total, 6 up, 6 in Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0. 
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.223325) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780223374, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2576, "num_deletes": 259, "total_data_size": 4391198, "memory_usage": 4562016, "flush_reason": "Manual Compaction"}
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780237126, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2872263, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29270, "largest_seqno": 31841, "table_properties": {"data_size": 2861951, "index_size": 6369, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25539, "raw_average_key_size": 22, "raw_value_size": 2839917, "raw_average_value_size": 2478, "num_data_blocks": 273, "num_entries": 1146, "num_filter_entries": 1146, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581663, "oldest_key_time": 1771581663, "file_creation_time": 1771581780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 13860 microseconds, and 7323 cpu microseconds.
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.237180) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2872263 bytes OK
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.237210) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.239469) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.239492) EVENT_LOG_v1 {"time_micros": 1771581780239485, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.239518) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 4378955, prev total WAL file size 4378955, number of live WAL files 2.
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.240725) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2804KB)], [45(17MB)]
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780240775, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 20918796, "oldest_snapshot_seqno": -1}
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 14308 keys, 19489804 bytes, temperature: kUnknown
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780326751, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 19489804, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19406886, "index_size": 46081, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35781, "raw_key_size": 381426, "raw_average_key_size": 26, "raw_value_size": 19162839, "raw_average_value_size": 1339, "num_data_blocks": 1735, "num_entries": 14308, "num_filter_entries": 14308, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.327114) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 19489804 bytes
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.329374) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 243.0 rd, 226.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 17.2 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(14.1) write-amplify(6.8) OK, records in: 14849, records dropped: 541 output_compression: NoCompression
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.329409) EVENT_LOG_v1 {"time_micros": 1771581780329391, "job": 26, "event": "compaction_finished", "compaction_time_micros": 86093, "compaction_time_cpu_micros": 53580, "output_level": 6, "num_output_files": 1, "total_output_size": 19489804, "num_input_records": 14849, "num_output_records": 14308, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780330004, "job": 26, "event": "table_file_deletion", "file_number": 47}
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780332810, "job": 26, "event": "table_file_deletion", "file_number": 45}
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.240596) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332924) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332930) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332937) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 05:03:00 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 05:03:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 05:03:01 localhost nova_compute[281288]: 2026-02-20 10:03:01.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 20 05:03:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e265 e265: 6 total, 6 up, 6 in
Feb 20 05:03:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 05:03:02 localhost podman[323244]: 2026-02-20 10:03:02.65452737 +0000 UTC m=+0.089957156 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 20 05:03:02 localhost podman[323244]: 2026-02-20 10:03:02.669214648 +0000 UTC m=+0.104644464 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 20 05:03:02 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 05:03:02 localhost nova_compute[281288]: 2026-02-20 10:03:02.959 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:03:02 localhost nova_compute[281288]: 2026-02-20 10:03:02.960 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:03:02 localhost nova_compute[281288]: 2026-02-20 10:03:02.961 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 05:03:02 localhost nova_compute[281288]: 2026-02-20 10:03:02.961 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 05:03:02 localhost nova_compute[281288]: 2026-02-20 10:03:02.965 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:02 localhost nova_compute[281288]: 2026-02-20 10:03:02.965 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 05:03:03 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e266 e266: 6 total, 6 up, 6 in
Feb 20 05:03:03 localhost nova_compute[281288]: 2026-02-20 10:03:03.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 20 05:03:05 localhost nova_compute[281288]: 2026-02-20 10:03:05.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 20 05:03:05 localhost nova_compute[281288]: 2026-02-20 10:03:05.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 05:03:05 localhost nova_compute[281288]: 2026-02-20 10:03:05.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 05:03:05 localhost nova_compute[281288]: 2026-02-20 10:03:05.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 05:03:05 localhost nova_compute[281288]: 2026-02-20 10:03:05.747 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 20 05:03:05 localhost nova_compute[281288]: 2026-02-20 10:03:05.747 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 20 05:03:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:06.026 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 05:03:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:06.027 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 05:03:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:06.028 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 05:03:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 05:03:06 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1846256840' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.184 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.249 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.250 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.490 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.493 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11205MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.493 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.494 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.565 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.566 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.566 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 20 05:03:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 05:03:06 localhost nova_compute[281288]: 2026-02-20 10:03:06.597 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 20 05:03:07 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 05:03:07 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2235935279' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.051 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.057 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.079 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.081 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.082 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 20 05:03:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:07.091 264355 INFO neutron.agent.linux.ip_lib [None req-313a2c9a-588b-4ef7-9f9b-edba28ce1bc5 - - - - - -] Device tap736aea51-80 cannot be used as it has no MAC address#033[00m
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.114 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:07 localhost kernel: device tap736aea51-80 entered promiscuous mode
Feb 20 05:03:07 localhost NetworkManager[5988]: [1771581787.1205] manager: (tap736aea51-80): new Generic device (/org/freedesktop/NetworkManager/Devices/64)
Feb 20 05:03:07 localhost ovn_controller[156798]: 2026-02-20T10:03:07Z|00416|binding|INFO|Claiming lport 736aea51-8061-4aa8-b593-31839f1f2534 for this chassis.
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.123 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:07 localhost ovn_controller[156798]: 2026-02-20T10:03:07Z|00417|binding|INFO|736aea51-8061-4aa8-b593-31839f1f2534: Claiming unknown
Feb 20 05:03:07 localhost systemd-udevd[323322]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 05:03:07 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:07.131 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-fd32aaea-98e9-4dfa-ad52-36d30939560e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd32aaea-98e9-4dfa-ad52-36d30939560e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a47e12e114b4e778ff94aca4e5dad8b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88fdeb8d-ebba-4734-9a0b-e4eba3120811, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=736aea51-8061-4aa8-b593-31839f1f2534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 20 05:03:07 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:07.133 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 736aea51-8061-4aa8-b593-31839f1f2534 in datapath fd32aaea-98e9-4dfa-ad52-36d30939560e bound to our chassis#033[00m
Feb 20 05:03:07 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:07.135 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 30127f4b-6d7e-49a7-8ad3-e1d1e43df164 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 20 05:03:07 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:07.135 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd32aaea-98e9-4dfa-ad52-36d30939560e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 20 05:03:07 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:07.136 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d9648fb4-ae99-438a-973e-97da56528624]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 20 05:03:07 localhost journal[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 05:03:07 localhost journal[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 05:03:07 localhost ovn_controller[156798]: 2026-02-20T10:03:07Z|00418|binding|INFO|Setting lport 736aea51-8061-4aa8-b593-31839f1f2534 ovn-installed in OVS
Feb 20 05:03:07 localhost ovn_controller[156798]: 2026-02-20T10:03:07Z|00419|binding|INFO|Setting lport 736aea51-8061-4aa8-b593-31839f1f2534 up in Southbound
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.170 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:07 localhost journal[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 05:03:07 localhost journal[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 05:03:07 localhost journal[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 05:03:07 localhost journal[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 05:03:07 localhost journal[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 05:03:07 localhost journal[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.213 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.243 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:07 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:07.738 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 20 05:03:07 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:07.740 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.774 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:07 localhost nova_compute[281288]: 2026-02-20 10:03:07.971 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:08 localhost nova_compute[281288]: 2026-02-20 10:03:08.002 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:03:08 localhost nova_compute[281288]: 2026-02-20 10:03:08.082 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 20 05:03:08 localhost nova_compute[281288]: 2026-02-20 10:03:08.082 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 20 05:03:08 localhost nova_compute[281288]: 2026-02-20 10:03:08.083 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 20 05:03:08 localhost podman[323394]:
Feb 20 05:03:08 localhost podman[323394]: 2026-02-20 10:03:08.137164131 +0000 UTC m=+0.087233233 container create 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 05:03:08 localhost podman[323394]: 2026-02-20 10:03:08.087258278 +0000 UTC m=+0.037327410 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 05:03:08 localhost systemd[1]: Started libpod-conmon-31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf.scope.
Feb 20 05:03:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e267 e267: 6 total, 6 up, 6 in
Feb 20 05:03:08 localhost systemd[1]: Started libcrun container.
Feb 20 05:03:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/febc23c981ad25c134ca1cbca59507084ea6219789b24f92013e13e04132bbc8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 05:03:08 localhost podman[323394]: 2026-02-20 10:03:08.226379263 +0000 UTC m=+0.176448365 container init 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 05:03:08 localhost podman[323394]: 2026-02-20 10:03:08.238015908 +0000 UTC m=+0.188085050 container start 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 05:03:08 localhost systemd[1]: tmp-crun.Ne9YZP.mount: Deactivated successfully.
Feb 20 05:03:08 localhost dnsmasq[323412]: started, version 2.85 cachesize 150 Feb 20 05:03:08 localhost dnsmasq[323412]: DNS service limited to local subnets Feb 20 05:03:08 localhost dnsmasq[323412]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 05:03:08 localhost dnsmasq[323412]: warning: no upstream servers configured Feb 20 05:03:08 localhost dnsmasq-dhcp[323412]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 05:03:08 localhost dnsmasq[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/addn_hosts - 0 addresses Feb 20 05:03:08 localhost dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/host Feb 20 05:03:08 localhost dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/opts Feb 20 05:03:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:08.391 264355 INFO neutron.agent.dhcp.agent [None req-d1d35632-d433-4be7-a245-09c192536fe6 - - - - - -] DHCP configuration for ports {'1344df8e-cf17-4eaf-a761-77690035f079'} is completed#033[00m Feb 20 05:03:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:08.411 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:03:08Z, description=, device_id=91bb17b9-dbc2-4da3-ba37-b6215b1cc229, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=96452e5e-5c94-4f20-aee2-527568a64031, ip_allocation=immediate, mac_address=fa:16:3e:5f:7d:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:03:05Z, description=, dns_domain=, id=fd32aaea-98e9-4dfa-ad52-36d30939560e, 
ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-664160433-network, port_security_enabled=True, project_id=0a47e12e114b4e778ff94aca4e5dad8b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15575, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3833, status=ACTIVE, subnets=['17f91002-6186-43f7-bdd4-354cdf443dac'], tags=[], tenant_id=0a47e12e114b4e778ff94aca4e5dad8b, updated_at=2026-02-20T10:03:05Z, vlan_transparent=None, network_id=fd32aaea-98e9-4dfa-ad52-36d30939560e, port_security_enabled=False, project_id=0a47e12e114b4e778ff94aca4e5dad8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3841, status=DOWN, tags=[], tenant_id=0a47e12e114b4e778ff94aca4e5dad8b, updated_at=2026-02-20T10:03:08Z on network fd32aaea-98e9-4dfa-ad52-36d30939560e#033[00m Feb 20 05:03:08 localhost dnsmasq[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/addn_hosts - 1 addresses Feb 20 05:03:08 localhost dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/host Feb 20 05:03:08 localhost dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/opts Feb 20 05:03:08 localhost podman[323429]: 2026-02-20 10:03:08.649347238 +0000 UTC m=+0.070408529 container kill 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team) Feb 20 05:03:08 localhost nova_compute[281288]: 2026-02-20 10:03:08.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:03:08 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:08.945 264355 INFO neutron.agent.dhcp.agent [None req-227c1349-320a-4a90-897d-8015e4b38232 - - - - - -] DHCP configuration for ports {'96452e5e-5c94-4f20-aee2-527568a64031'} is completed#033[00m Feb 20 05:03:09 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:09.110 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:03:08Z, description=, device_id=91bb17b9-dbc2-4da3-ba37-b6215b1cc229, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=96452e5e-5c94-4f20-aee2-527568a64031, ip_allocation=immediate, mac_address=fa:16:3e:5f:7d:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:03:05Z, description=, dns_domain=, id=fd32aaea-98e9-4dfa-ad52-36d30939560e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-664160433-network, port_security_enabled=True, project_id=0a47e12e114b4e778ff94aca4e5dad8b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15575, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3833, status=ACTIVE, subnets=['17f91002-6186-43f7-bdd4-354cdf443dac'], tags=[], tenant_id=0a47e12e114b4e778ff94aca4e5dad8b, updated_at=2026-02-20T10:03:05Z, 
vlan_transparent=None, network_id=fd32aaea-98e9-4dfa-ad52-36d30939560e, port_security_enabled=False, project_id=0a47e12e114b4e778ff94aca4e5dad8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3841, status=DOWN, tags=[], tenant_id=0a47e12e114b4e778ff94aca4e5dad8b, updated_at=2026-02-20T10:03:08Z on network fd32aaea-98e9-4dfa-ad52-36d30939560e#033[00m Feb 20 05:03:09 localhost systemd[1]: tmp-crun.IeLgkq.mount: Deactivated successfully. Feb 20 05:03:09 localhost dnsmasq[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/addn_hosts - 1 addresses Feb 20 05:03:09 localhost dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/host Feb 20 05:03:09 localhost dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/opts Feb 20 05:03:09 localhost podman[323466]: 2026-02-20 10:03:09.317110142 +0000 UTC m=+0.062107136 container kill 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 05:03:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:03:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 05:03:09 localhost podman[323479]: 2026-02-20 10:03:09.447671796 +0000 UTC m=+0.093559506 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, version=9.7, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter) Feb 20 05:03:09 localhost podman[323479]: 2026-02-20 10:03:09.48646511 +0000 UTC m=+0.132352790 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, 
managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, io.buildah.version=1.33.7, distribution-scope=public, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Feb 20 05:03:09 localhost podman[323480]: 2026-02-20 10:03:09.503317674 +0000 UTC m=+0.147783560 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 05:03:09 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 05:03:09 localhost podman[323480]: 2026-02-20 10:03:09.535739883 +0000 UTC m=+0.180205789 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 05:03:09 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 05:03:09 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:09.587 264355 INFO neutron.agent.dhcp.agent [None req-8697ac76-559c-4a0f-b2d5-736492b4e829 - - - - - -] DHCP configuration for ports {'96452e5e-5c94-4f20-aee2-527568a64031'} is completed#033[00m Feb 20 05:03:09 localhost nova_compute[281288]: 2026-02-20 10:03:09.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:03:09 localhost nova_compute[281288]: 2026-02-20 10:03:09.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 05:03:09 localhost nova_compute[281288]: 2026-02-20 10:03:09.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 05:03:09 localhost nova_compute[281288]: 2026-02-20 10:03:09.805 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 05:03:09 localhost nova_compute[281288]: 2026-02-20 10:03:09.806 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 05:03:09 localhost nova_compute[281288]: 2026-02-20 10:03:09.806 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] 
[instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 05:03:09 localhost nova_compute[281288]: 2026-02-20 10:03:09.807 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 05:03:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:11 localhost nova_compute[281288]: 2026-02-20 10:03:11.834 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 05:03:11 localhost nova_compute[281288]: 2026-02-20 10:03:11.865 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 05:03:11 localhost nova_compute[281288]: 2026-02-20 10:03:11.865 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 05:03:11 localhost nova_compute[281288]: 2026-02-20 10:03:11.866 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:03:11 localhost nova_compute[281288]: 2026-02-20 10:03:11.866 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 05:03:12 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e268 e268: 6 total, 6 up, 6 in Feb 20 05:03:12 localhost nova_compute[281288]: 2026-02-20 10:03:12.969 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:12 localhost nova_compute[281288]: 2026-02-20 10:03:12.975 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:13 localhost dnsmasq[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/addn_hosts - 0 addresses Feb 20 05:03:13 localhost podman[323544]: 2026-02-20 10:03:13.452805586 +0000 UTC m=+0.051874233 container kill 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 20 05:03:13 localhost dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/host Feb 20 05:03:13 localhost dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/opts Feb 20 05:03:13 localhost ovn_controller[156798]: 2026-02-20T10:03:13Z|00420|binding|INFO|Releasing lport 736aea51-8061-4aa8-b593-31839f1f2534 from this chassis (sb_readonly=0) Feb 20 05:03:13 localhost kernel: device tap736aea51-80 left promiscuous mode Feb 20 05:03:13 localhost ovn_controller[156798]: 
2026-02-20T10:03:13Z|00421|binding|INFO|Setting lport 736aea51-8061-4aa8-b593-31839f1f2534 down in Southbound Feb 20 05:03:13 localhost nova_compute[281288]: 2026-02-20 10:03:13.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:13 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:13.964 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-fd32aaea-98e9-4dfa-ad52-36d30939560e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd32aaea-98e9-4dfa-ad52-36d30939560e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a47e12e114b4e778ff94aca4e5dad8b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88fdeb8d-ebba-4734-9a0b-e4eba3120811, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=736aea51-8061-4aa8-b593-31839f1f2534) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:03:13 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:13.966 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 736aea51-8061-4aa8-b593-31839f1f2534 in datapath 
fd32aaea-98e9-4dfa-ad52-36d30939560e unbound from our chassis#033[00m Feb 20 05:03:13 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:13.968 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd32aaea-98e9-4dfa-ad52-36d30939560e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 05:03:13 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:13.969 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1cefd803-5a48-4c8a-a809-7f0518704e6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 05:03:13 localhost nova_compute[281288]: 2026-02-20 10:03:13.973 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:15 localhost ovn_controller[156798]: 2026-02-20T10:03:15Z|00422|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:03:15 localhost nova_compute[281288]: 2026-02-20 10:03:15.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:16 localhost dnsmasq[323412]: exiting on receipt of SIGTERM Feb 20 05:03:16 localhost podman[323585]: 2026-02-20 10:03:16.332540671 +0000 UTC m=+0.050443990 container kill 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_managed=true) Feb 20 05:03:16 localhost systemd[1]: libpod-31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf.scope: Deactivated successfully. Feb 20 05:03:16 localhost podman[323599]: 2026-02-20 10:03:16.403543428 +0000 UTC m=+0.051676008 container died 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 20 05:03:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf-userdata-shm.mount: Deactivated successfully. Feb 20 05:03:16 localhost systemd[1]: var-lib-containers-storage-overlay-febc23c981ad25c134ca1cbca59507084ea6219789b24f92013e13e04132bbc8-merged.mount: Deactivated successfully. 
Feb 20 05:03:16 localhost podman[323599]: 2026-02-20 10:03:16.440260618 +0000 UTC m=+0.088393188 container remove 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 20 05:03:16 localhost systemd[1]: libpod-conmon-31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf.scope: Deactivated successfully. Feb 20 05:03:16 localhost systemd[1]: run-netns-qdhcp\x2dfd32aaea\x2d98e9\x2d4dfa\x2dad52\x2d36d30939560e.mount: Deactivated successfully. Feb 20 05:03:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:16.473 264355 INFO neutron.agent.dhcp.agent [None req-fa505e1f-dc27-493d-91a7-862d9d01285a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:03:16 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:16.473 264355 INFO neutron.agent.dhcp.agent [None req-fa505e1f-dc27-493d-91a7-862d9d01285a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:03:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:17 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e269 e269: 6 total, 6 up, 6 in Feb 20 05:03:17 localhost podman[241968]: time="2026-02-20T10:03:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:03:17 localhost podman[241968]: @ - - [20/Feb/2026:10:03:17 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:03:17 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:17.743 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 05:03:17 localhost podman[241968]: @ - - [20/Feb/2026:10:03:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18351 "" "Go-http-client/1.1" Feb 20 05:03:18 localhost nova_compute[281288]: 2026-02-20 10:03:18.010 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:03:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 05:03:18 localhost podman[323623]: 2026-02-20 10:03:18.159328849 +0000 UTC m=+0.098158957 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 05:03:18 localhost podman[323623]: 2026-02-20 10:03:18.212012536 +0000 UTC m=+0.150842594 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 20 05:03:18 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e270 e270: 6 total, 6 up, 6 in Feb 20 05:03:18 localhost systemd[1]: tmp-crun.7x7yGd.mount: Deactivated successfully. Feb 20 05:03:18 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 05:03:18 localhost podman[323624]: 2026-02-20 10:03:18.229802719 +0000 UTC m=+0.164656745 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 20 05:03:18 localhost 
podman[323624]: 2026-02-20 10:03:18.265427506 +0000 UTC m=+0.200281522 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 05:03:18 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.322 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.324 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '702c5655-bec4-4378-82ee-bc0ccec7c4bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.324176', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '614e0c82-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '41b36c66105151bfab5a0726ed4b087b84dfa11439cdf84af9425ea6d6884788'}]}, 'timestamp': '2026-02-20 10:03:18.330272', '_unique_id': 'e595ff8bf216475aa1471b7e1a644149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.333 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.355 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a567cfa8-5142-40d5-8400-c52cb6cb74bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:03:18.334088', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '61521502-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.594674855, 
'message_signature': '05e3ba206579408ccc88daad9f27719333fa847ec0361cfb695cb23e1598e153'}]}, 'timestamp': '2026-02-20 10:03:18.356744', '_unique_id': 'ce6f8791c6964841992e5343f7d60a28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.359 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.394 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.395 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '82e38618-2f01-4190-abe2-7c8e3fb63b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.360016', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615800a2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '72a34631e2e96741f340407ab1ed7c7c73f83bbd7dc5ff5661df2d263d813f22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.360016', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615817f4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '3955404c7f61e8b190069987391f9b35b0d2548145fbf3e731c04845bb099041'}]}, 'timestamp': '2026-02-20 10:03:18.396006', '_unique_id': 'f34a009c24724655a5890b9fe5867019'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.399 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.410 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.411 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59d7f5c5-30ad-42b0-9fef-4580968b5273', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.399342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615a713e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': '65424c96d37bfa61c6d761bb8c8fe9adf7326266c359ac85a28d1f1747af41ed'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': 
'2026-02-20T10:03:18.399342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615a82aa-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': 'eede08b5d9574db5b0ee3e1eb945a20cf458d905213b69c0ab3e368ef78bbe78'}]}, 'timestamp': '2026-02-20 10:03:18.411788', '_unique_id': '869278872f6e4d3b96c3e5d7c778e4df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.414 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.415 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4972adcd-d0f7-43e5-9430-890eeee6d117', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.414499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615b052c-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': '7e98f2203ab3082462e5216b4a2da961b9c59b1919c0d1db3d23a6de808ae606'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.414499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615b1a12-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': 'cdc8fed0f3eb9860d36dc4dc04192aa3872ddeace97b83b826b5e816d576fb1e'}]}, 'timestamp': '2026-02-20 10:03:18.415667', '_unique_id': 'c72f9b606e0d4b79a56a2da6cce24c90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.418 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.418 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef75ff5a-0339-4c90-87bf-7a259070333a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.418315', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615b95aa-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '8acd775f90acc89a025e1b5b692c2cc532c0cb7b72da854d99287f479ca1738f'}]}, 'timestamp': '2026-02-20 10:03:18.418888', '_unique_id': '9bf4afeeaab647789981ac9d70df32de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.421 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.421 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d73f51e-aaf8-4b8c-adf4-59cf1810d00f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.421869', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615c1fa2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': 'ca789ca7422d532a43a11ba9caa5d8d1e079398109a0d63ef452318a65dee0cd'}]}, 'timestamp': '2026-02-20 10:03:18.422350', '_unique_id': '9a875e2e41cd42d5a9419e5657822387'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.424 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.425 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'daa3636c-bfd5-4791-9e31-4eff6923e5cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.424748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615c91e4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': '5e40d349ffb559cd04394865bb739f5e2e9360d828ea7943bc453ba0dffa8de2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.424748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615ca36e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': 'eec9087cedf3e84fc57c888d52af1276ac3a0abad5b68d61b7eb9c62c72534ed'}]}, 'timestamp': '2026-02-20 10:03:18.425724', '_unique_id': '6340bf86769c4192bd77fd12e67c663b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:03:18.426 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:03:18.426 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.428 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4a3b5c40-0fc5-4c4a-be72-47b4ac9257e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.428266', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615d19a2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '0abab6b30c8d8713e7f430691754f0f1641d502fff37d7f0ddaf05654390b248'}]}, 'timestamp': '2026-02-20 10:03:18.428789', '_unique_id': 'b85d4a2d678b474a91d5aa92017b84be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.430 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd34adab-3c89-4c14-9cd1-3a448df0c562', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.430929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615d811c-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'e6b4898baec9da6696b5d87ecb2243b4c29a512290d643563650b8bcd95dd3a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.430929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615d922e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '21a5478d68dc783ab09914790578d79297cc555fe97f92c0cf1fab4486c0f440'}]}, 'timestamp': '2026-02-20 10:03:18.431854', '_unique_id': '3b879691680b4d059e0e599e40b9bc9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:03:18.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.434 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.434 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.434 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '88249f59-6a2e-4d8a-9947-eba10480b2d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.434258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615e0470-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'd4733aec1d9eba5a5b1450cb82202b5fadae798b65d77e3aa19e75fbf60254f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.434258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615e18b6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '7cc443dfc06e2b791e219613d718b336619e666d9da6567acf9a2dd49ab79624'}]}, 'timestamp': '2026-02-20 10:03:18.435249', '_unique_id': '11a8f9afe7894172af94b7eb86d65701'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR 
oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.437 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '0b757f67-f1f5-47fc-8cfb-e672d35e1e95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.437509', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615e83d2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': 'a7279612f286b56a4d2170cc72e38f41440c03c270c0809cc7bec0da036dd183'}]}, 'timestamp': '2026-02-20 10:03:18.438066', '_unique_id': '5e22831bb09045d4940386d1242c0c6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.440 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.440 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.440 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ede0b88c-d828-49f6-ae99-6c8c8c2bf588', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.440474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615ef7ea-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'c088c47c56bb13117e2173d9e45b09470c314b97aeac049370ccb66b384b3d76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.440474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615f08d4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'b99741234e6f2c01a08b6b10b802c805f9c90b7924b8e7a9952a281c1968542c'}]}, 'timestamp': '2026-02-20 10:03:18.441391', '_unique_id': '4bc8984b05474964ac9331876a158d87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.443 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '198c6523-bc7c-4517-9680-a60ffe97530b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.443583', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615f7120-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '01366cae0aaa048058f5562f339f37a16843bb768295e38e5c6cec237992048f'}]}, 'timestamp': '2026-02-20 10:03:18.444127', '_unique_id': 'c7588b7aa632416ca538e961aba42d35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.446 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.446 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c8ef41b-6add-489c-a35d-354528167a44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.446486', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615fe240-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '05ee99959dc8695a83d4bbe7d0a2d96bf2a6f70aad6ba15d052cc46312927152'}]}, 'timestamp': '2026-02-20 10:03:18.446996', '_unique_id': '235f8704b1b843dca50430940faeebac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.449 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6090d9a-d4e9-40fe-aee5-e6d5e662deeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.449275', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '61604c94-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '1b3c822981a5ed36f23b2070c668f3e8f23676dbdb27657dac1bfcc5ca784ece'}]}, 'timestamp': '2026-02-20 10:03:18.449716', '_unique_id': 'de3ecf2cdb2d4fffb485efdf0f0e3c6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 
05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.451 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a1381914-1d71-4164-aaac-b35650146146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.451185', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6160951e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': 'a7c68bf53eccadd0c70cbe24afd2638f55f39d06434faa8f1ab7e057a17f4bad'}]}, 'timestamp': '2026-02-20 10:03:18.451480', '_unique_id': '63d10638e7ec4ef09533a1d4df1e67c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '777dfd6c-7b7f-4821-b8d4-cb155d321a24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.453044', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6160ddbc-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '60f8059c3a265aa2686bd5183708015f83abcaf4ed7ddd350acebbe9b2d0fc58'}]}, 'timestamp': '2026-02-20 10:03:18.453338', '_unique_id': '771b01dd4a174a0fbee8f217de9062cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.454 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.454 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6e73b26-29c4-4943-ad09-9413b4645149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.454745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6161202e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'a72c4eb45fc1b94136b9928659a1ed9a56aed8ee1b16fa2b607db1cb07335aa4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.454745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61612af6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '47cbb9c15d58381e666561742218fdc9102f1c11d8d2428ac61780dd8de6bb36'}]}, 'timestamp': '2026-02-20 10:03:18.455296', '_unique_id': '1d68fba5c4294c409d87825d2543f653'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.456 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.456 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 20130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b048b44-6563-4583-8e2e-57174a74924c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20130000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:03:18.456733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '61616e8a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.594674855, 'message_signature': '1ac0fca5e1b8d7768cdd047fae873ebea9d91e4a4c79e801671e0f32985dd274'}]}, 'timestamp': '2026-02-20 10:03:18.457049', '_unique_id': '40001bbf432543428ab60b137cc5d1bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]:
2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 20 05:03:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.458 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.458 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7431d514-41eb-4d0d-b976-a0f1c2a4bc0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.458423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6161b1a6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '84da3c2c7d49d75016ffdd4ac4b16960e5186e88e06998043d335f991445379e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.458423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6161bfde-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '125735c0c6e700cba44d13ddbc960d1e7586cfb74b6ce936114c8bb0ffe604fc'}]}, 'timestamp': '2026-02-20 10:03:18.459112', '_unique_id': '5e196f9b3b464ed69bb52384863c934b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:03:18.459 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging Feb 20 05:03:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.460 12 DEBUG 
ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:03:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:23 localhost nova_compute[281288]: 2026-02-20 10:03:23.012 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:23 localhost nova_compute[281288]: 2026-02-20 10:03:23.014 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:23 localhost nova_compute[281288]: 2026-02-20 10:03:23.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:03:23 localhost nova_compute[281288]: 2026-02-20 10:03:23.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:03:23 localhost nova_compute[281288]: 2026-02-20 10:03:23.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:23 localhost nova_compute[281288]: 2026-02-20 10:03:23.047 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:03:23 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e271 e271: 6 total, 6 up, 6 in Feb 20 05:03:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 05:03:26 localhost systemd[1]: tmp-crun.3jxZ8N.mount: Deactivated successfully. Feb 20 05:03:26 localhost podman[323666]: 2026-02-20 10:03:26.15878203 +0000 UTC m=+0.098164866 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 05:03:26 localhost podman[323666]: 2026-02-20 10:03:26.172151609 +0000 UTC m=+0.111534435 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:03:26 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:03:26 localhost openstack_network_exporter[244414]: ERROR 10:03:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:03:26 localhost openstack_network_exporter[244414]: Feb 20 05:03:26 localhost openstack_network_exporter[244414]: ERROR 10:03:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:03:26 localhost openstack_network_exporter[244414]: Feb 20 05:03:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:28 localhost nova_compute[281288]: 2026-02-20 10:03:28.048 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:28 localhost nova_compute[281288]: 2026-02-20 10:03:28.049 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:28 localhost nova_compute[281288]: 2026-02-20 10:03:28.049 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:03:28 localhost nova_compute[281288]: 2026-02-20 10:03:28.050 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:03:28 localhost nova_compute[281288]: 2026-02-20 10:03:28.050 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:03:28 localhost nova_compute[281288]: 2026-02-20 10:03:28.055 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:33 localhost nova_compute[281288]: 2026-02-20 10:03:33.054 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:33 localhost nova_compute[281288]: 2026-02-20 10:03:33.055 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:33 localhost nova_compute[281288]: 2026-02-20 10:03:33.056 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:03:33 localhost nova_compute[281288]: 2026-02-20 10:03:33.056 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:03:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 05:03:33 localhost nova_compute[281288]: 2026-02-20 10:03:33.074 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:33 localhost nova_compute[281288]: 2026-02-20 10:03:33.074 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:03:33 localhost podman[323685]: 2026-02-20 10:03:33.166786412 +0000 UTC m=+0.080828527 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 05:03:33 localhost podman[323685]: 2026-02-20 10:03:33.202873353 +0000 UTC m=+0.116915428 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 
'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 05:03:33 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 05:03:33 localhost sshd[323708]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:03:34 localhost auditd[725]: Audit daemon rotating log files Feb 20 05:03:36 localhost sshd[323711]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:03:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e272 e272: 6 total, 6 up, 6 in Feb 20 05:03:38 localhost nova_compute[281288]: 2026-02-20 10:03:38.075 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:38 localhost nova_compute[281288]: 2026-02-20 10:03:38.077 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:38 localhost nova_compute[281288]: 2026-02-20 10:03:38.077 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:03:38 localhost nova_compute[281288]: 2026-02-20 10:03:38.077 281292 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:03:38 localhost nova_compute[281288]: 2026-02-20 10:03:38.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:38 localhost nova_compute[281288]: 2026-02-20 10:03:38.125 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:03:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:03:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 05:03:40 localhost podman[323713]: 2026-02-20 10:03:40.1484618 +0000 UTC m=+0.086887952 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z) Feb 20 05:03:40 localhost 
podman[323713]: 2026-02-20 10:03:40.161692014 +0000 UTC m=+0.100118186 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 20 05:03:40 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 05:03:40 localhost systemd[1]: tmp-crun.67b3Ka.mount: Deactivated successfully. 
Feb 20 05:03:40 localhost podman[323714]: 2026-02-20 10:03:40.209234835 +0000 UTC m=+0.145706457 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 20 05:03:40 localhost podman[323714]: 2026-02-20 10:03:40.241479838 +0000 UTC m=+0.177951460 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 05:03:40 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 05:03:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:43 localhost nova_compute[281288]: 2026-02-20 10:03:43.127 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:43 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e273 e273: 6 total, 6 up, 6 in Feb 20 05:03:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:47 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e274 e274: 6 total, 6 up, 6 in Feb 20 05:03:47 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:47.633 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:03:47 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:47.634 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 20 05:03:47 localhost nova_compute[281288]: 2026-02-20 10:03:47.673 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:47 localhost 
podman[241968]: time="2026-02-20T10:03:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:03:47 localhost podman[241968]: @ - - [20/Feb/2026:10:03:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:03:47 localhost podman[241968]: @ - - [20/Feb/2026:10:03:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18357 "" "Go-http-client/1.1" Feb 20 05:03:48 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:48.045 264355 INFO neutron.agent.linux.ip_lib [None req-cbf568c4-6016-453f-a7ca-c6d359eb2604 - - - - - -] Device tap579a4ffd-e1 cannot be used as it has no MAC address#033[00m Feb 20 05:03:48 localhost nova_compute[281288]: 2026-02-20 10:03:48.076 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:48 localhost kernel: device tap579a4ffd-e1 entered promiscuous mode Feb 20 05:03:48 localhost NetworkManager[5988]: [1771581828.0864] manager: (tap579a4ffd-e1): new Generic device (/org/freedesktop/NetworkManager/Devices/65) Feb 20 05:03:48 localhost ovn_controller[156798]: 2026-02-20T10:03:48Z|00423|binding|INFO|Claiming lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 for this chassis. Feb 20 05:03:48 localhost ovn_controller[156798]: 2026-02-20T10:03:48Z|00424|binding|INFO|579a4ffd-e114-452c-9e7e-536d8c8914c1: Claiming unknown Feb 20 05:03:48 localhost nova_compute[281288]: 2026-02-20 10:03:48.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:48 localhost systemd-udevd[323766]: Network interface NamePolicy= disabled on kernel command line. 
Feb 20 05:03:48 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:48.099 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-7e429db3-d8ae-4504-93c9-b3bbe8cdc007', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e429db3-d8ae-4504-93c9-b3bbe8cdc007', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bb651850cad14d76bf9ffb2d11fd8747', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9249ec89-4893-4da0-9067-aa2693e86932, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=579a4ffd-e114-452c-9e7e-536d8c8914c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:03:48 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:48.101 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 579a4ffd-e114-452c-9e7e-536d8c8914c1 in datapath 7e429db3-d8ae-4504-93c9-b3bbe8cdc007 bound to our chassis#033[00m Feb 20 05:03:48 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:48.103 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port b6450798-e8b9-4b7d-a648-348e62ed609f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 05:03:48 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:48.104 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e429db3-d8ae-4504-93c9-b3bbe8cdc007, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 05:03:48 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:48.105 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d0dedf49-2b5d-4579-9812-70bf0ac86481]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 05:03:48 localhost ovn_controller[156798]: 2026-02-20T10:03:48Z|00425|binding|INFO|Setting lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 ovn-installed in OVS Feb 20 05:03:48 localhost ovn_controller[156798]: 2026-02-20T10:03:48Z|00426|binding|INFO|Setting lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 up in Southbound Feb 20 05:03:48 localhost journal[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device Feb 20 05:03:48 localhost journal[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device Feb 20 05:03:48 localhost nova_compute[281288]: 2026-02-20 10:03:48.132 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:48 localhost nova_compute[281288]: 2026-02-20 10:03:48.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:03:48 localhost journal[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device Feb 20 05:03:48 localhost journal[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device Feb 20 05:03:48 localhost journal[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device Feb 20 05:03:48 localhost journal[229984]: ethtool ioctl error on tap579a4ffd-e1: No such 
device Feb 20 05:03:48 localhost journal[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device Feb 20 05:03:48 localhost journal[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device Feb 20 05:03:48 localhost nova_compute[281288]: 2026-02-20 10:03:48.174 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:48 localhost nova_compute[281288]: 2026-02-20 10:03:48.207 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:48 localhost ovn_controller[156798]: 2026-02-20T10:03:48Z|00427|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:03:48 localhost nova_compute[281288]: 2026-02-20 10:03:48.261 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 05:03:49 localhost podman[323831]: 2026-02-20 10:03:49.168585175 +0000 UTC m=+0.097503536 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 05:03:49 localhost 
podman[323848]: Feb 20 05:03:49 localhost podman[323831]: 2026-02-20 10:03:49.205288075 +0000 UTC m=+0.134206436 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 20 05:03:49 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 05:03:49 localhost podman[323830]: 2026-02-20 10:03:49.217692153 +0000 UTC m=+0.150340258 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 20 05:03:49 localhost podman[323848]: 2026-02-20 10:03:49.24349936 +0000 UTC m=+0.141440376 container create c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127) Feb 20 05:03:49 localhost podman[323848]: 2026-02-20 10:03:49.153200225 +0000 UTC m=+0.051141231 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 05:03:49 localhost systemd[1]: Started libpod-conmon-c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023.scope. Feb 20 05:03:49 localhost systemd[1]: Started libcrun container. Feb 20 05:03:49 localhost podman[323830]: 2026-02-20 10:03:49.330918978 +0000 UTC m=+0.263567153 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 20 05:03:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77fa0a05beec35518c693185a2da6b468c236c000516b316ebb93a07ea0c1c2b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 05:03:49 localhost podman[323848]: 2026-02-20 10:03:49.342222193 +0000 UTC m=+0.240163209 container init c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:03:49 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 05:03:49 localhost podman[323848]: 2026-02-20 10:03:49.352609659 +0000 UTC m=+0.250550675 container start c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:03:49 localhost dnsmasq[323894]: started, version 2.85 cachesize 150 Feb 20 05:03:49 localhost dnsmasq[323894]: DNS service limited to local subnets Feb 20 05:03:49 localhost dnsmasq[323894]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 05:03:49 localhost dnsmasq[323894]: warning: no upstream servers configured Feb 20 05:03:49 localhost dnsmasq-dhcp[323894]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 05:03:49 localhost dnsmasq[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/addn_hosts - 0 addresses Feb 20 05:03:49 localhost dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/host Feb 20 05:03:49 localhost dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/opts Feb 20 05:03:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:49.413 264355 INFO neutron.agent.dhcp.agent [None req-b4eb6337-31e1-438e-8f4b-281f20c8fb2e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:03:48Z, description=, 
device_id=589aa915-f12b-4442-ae0f-97795f57950f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=16cd30e7-a2a9-4140-b0a6-49e3b9576f4a, ip_allocation=immediate, mac_address=fa:16:3e:b6:56:5b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:03:45Z, description=, dns_domain=, id=7e429db3-d8ae-4504-93c9-b3bbe8cdc007, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-194178913-network, port_security_enabled=True, project_id=bb651850cad14d76bf9ffb2d11fd8747, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35672, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3892, status=ACTIVE, subnets=['ded50358-efae-40fd-8857-2510da215a44'], tags=[], tenant_id=bb651850cad14d76bf9ffb2d11fd8747, updated_at=2026-02-20T10:03:46Z, vlan_transparent=None, network_id=7e429db3-d8ae-4504-93c9-b3bbe8cdc007, port_security_enabled=False, project_id=bb651850cad14d76bf9ffb2d11fd8747, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3900, status=DOWN, tags=[], tenant_id=bb651850cad14d76bf9ffb2d11fd8747, updated_at=2026-02-20T10:03:48Z on network 7e429db3-d8ae-4504-93c9-b3bbe8cdc007#033[00m Feb 20 05:03:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:49.521 264355 INFO neutron.agent.dhcp.agent [None req-384d23a2-7f09-4bd3-af9f-73ce37256037 - - - - - -] DHCP configuration for ports {'ce127046-9f8a-4325-9a56-0ecbb6db009d'} is completed#033[00m Feb 20 05:03:49 localhost dnsmasq[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/addn_hosts - 1 addresses Feb 20 05:03:49 localhost dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/host Feb 20 05:03:49 localhost podman[323911]: 
2026-02-20 10:03:49.640517554 +0000 UTC m=+0.060591800 container kill c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 20 05:03:49 localhost dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/opts Feb 20 05:03:49 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:49.882 264355 INFO neutron.agent.dhcp.agent [None req-cd70430f-aa50-4275-9258-a481487fc887 - - - - - -] DHCP configuration for ports {'16cd30e7-a2a9-4140-b0a6-49e3b9576f4a'} is completed#033[00m Feb 20 05:03:50 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:50.127 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:03:48Z, description=, device_id=589aa915-f12b-4442-ae0f-97795f57950f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=16cd30e7-a2a9-4140-b0a6-49e3b9576f4a, ip_allocation=immediate, mac_address=fa:16:3e:b6:56:5b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:03:45Z, description=, dns_domain=, id=7e429db3-d8ae-4504-93c9-b3bbe8cdc007, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-194178913-network, port_security_enabled=True, 
project_id=bb651850cad14d76bf9ffb2d11fd8747, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35672, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3892, status=ACTIVE, subnets=['ded50358-efae-40fd-8857-2510da215a44'], tags=[], tenant_id=bb651850cad14d76bf9ffb2d11fd8747, updated_at=2026-02-20T10:03:46Z, vlan_transparent=None, network_id=7e429db3-d8ae-4504-93c9-b3bbe8cdc007, port_security_enabled=False, project_id=bb651850cad14d76bf9ffb2d11fd8747, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3900, status=DOWN, tags=[], tenant_id=bb651850cad14d76bf9ffb2d11fd8747, updated_at=2026-02-20T10:03:48Z on network 7e429db3-d8ae-4504-93c9-b3bbe8cdc007#033[00m Feb 20 05:03:50 localhost dnsmasq[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/addn_hosts - 1 addresses Feb 20 05:03:50 localhost dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/host Feb 20 05:03:50 localhost dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/opts Feb 20 05:03:50 localhost podman[323947]: 2026-02-20 10:03:50.349903608 +0000 UTC m=+0.064551951 container kill c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:03:50 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:03:50.645 264355 INFO neutron.agent.dhcp.agent [None 
req-65891de8-6f13-46bf-ac3a-0dfd19d5bb62 - - - - - -] DHCP configuration for ports {'16cd30e7-a2a9-4140-b0a6-49e3b9576f4a'} is completed#033[00m Feb 20 05:03:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:52 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 05:03:52 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:03:53 localhost nova_compute[281288]: 2026-02-20 10:03:53.135 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:53 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 e275: 6 total, 6 up, 6 in Feb 20 05:03:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:03:56 localhost openstack_network_exporter[244414]: ERROR 10:03:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:03:56 localhost openstack_network_exporter[244414]: Feb 20 05:03:56 localhost openstack_network_exporter[244414]: ERROR 10:03:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:03:56 localhost openstack_network_exporter[244414]: Feb 20 05:03:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:03:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 05:03:57 localhost systemd[1]: tmp-crun.qiMFqC.mount: Deactivated successfully. 
Feb 20 05:03:57 localhost podman[324055]: 2026-02-20 10:03:57.175717811 +0000 UTC m=+0.101424185 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2) Feb 20 05:03:57 localhost podman[324055]: 2026-02-20 10:03:57.215369881 +0000 UTC m=+0.141076215 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
config_id=ceilometer_agent_compute) Feb 20 05:03:57 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:03:57 localhost ovn_metadata_agent[162647]: 2026-02-20 10:03:57.636 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 20 05:03:58 localhost nova_compute[281288]: 2026-02-20 10:03:58.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0. Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.253854) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49 Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838253916, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1193, "num_deletes": 265, "total_data_size": 1782068, "memory_usage": 1804352, "flush_reason": "Manual Compaction"} Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838262677, "cf_name": "default", "job": 27, "event": 
"table_file_creation", "file_number": 50, "file_size": 1169299, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31846, "largest_seqno": 33034, "table_properties": {"data_size": 1164271, "index_size": 2435, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12322, "raw_average_key_size": 20, "raw_value_size": 1153574, "raw_average_value_size": 1935, "num_data_blocks": 106, "num_entries": 596, "num_filter_entries": 596, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581781, "oldest_key_time": 1771581781, "file_creation_time": 1771581838, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}} Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 8865 microseconds, and 4209 cpu microseconds. Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.262728) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1169299 bytes OK Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.262754) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.264726) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.264746) EVENT_LOG_v1 {"time_micros": 1771581838264740, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.264770) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1776068, prev total WAL file size 1776392, number of live WAL files 2. Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.265487) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323732' seq:72057594037927935, type:22 .. 
'6C6F676D0034353234' seq:0, type:0; will stop at (end) Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1141KB)], [48(18MB)] Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838265535, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 20659103, "oldest_snapshot_seqno": -1} Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 14356 keys, 20442829 bytes, temperature: kUnknown Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838360503, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 20442829, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20358151, "index_size": 47713, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35909, "raw_key_size": 383773, "raw_average_key_size": 26, "raw_value_size": 20111830, "raw_average_value_size": 1400, "num_data_blocks": 1798, "num_entries": 14356, "num_filter_entries": 14356, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581838, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}} Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.360916) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 20442829 bytes Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.362735) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.2 rd, 214.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 18.6 +0.0 blob) out(19.5 +0.0 blob), read-write-amplify(35.2) write-amplify(17.5) OK, records in: 14904, records dropped: 548 output_compression: NoCompression Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.362764) EVENT_LOG_v1 {"time_micros": 1771581838362751, "job": 28, "event": "compaction_finished", "compaction_time_micros": 95133, "compaction_time_cpu_micros": 53506, "output_level": 6, "num_output_files": 1, "total_output_size": 20442829, "num_input_records": 14904, "num_output_records": 14356, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005625204/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838363069, "job": 28, "event": "table_file_deletion", "file_number": 50} Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838365956, "job": 28, "event": "table_file_deletion", "file_number": 48} Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.265363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366043) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:03:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366050) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:03:59 localhost nova_compute[281288]: 2026-02-20 10:03:59.619 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:01 localhost nova_compute[281288]: 2026-02-20 10:04:01.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:02 localhost nova_compute[281288]: 2026-02-20 10:04:02.240 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:03 localhost nova_compute[281288]: 2026-02-20 10:04:03.142 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 05:04:04 localhost podman[324074]: 2026-02-20 10:04:04.173839102 +0000 UTC m=+0.101684283 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 20 05:04:04 localhost podman[324074]: 2026-02-20 10:04:04.213315456 +0000 UTC m=+0.141160577 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 20 05:04:04 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 05:04:04 localhost dnsmasq[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/addn_hosts - 0 addresses Feb 20 05:04:04 localhost dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/host Feb 20 05:04:04 localhost dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/opts Feb 20 05:04:04 localhost podman[324112]: 2026-02-20 10:04:04.296799564 +0000 UTC m=+0.063623913 container kill c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 20 05:04:04 localhost ovn_controller[156798]: 2026-02-20T10:04:04Z|00428|binding|INFO|Releasing lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 from this chassis (sb_readonly=0) Feb 20 05:04:04 localhost ovn_controller[156798]: 2026-02-20T10:04:04Z|00429|binding|INFO|Setting lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 down in Southbound Feb 20 05:04:04 localhost kernel: device tap579a4ffd-e1 left promiscuous mode Feb 20 05:04:04 localhost nova_compute[281288]: 2026-02-20 10:04:04.553 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:04 localhost 
ovn_metadata_agent[162647]: 2026-02-20 10:04:04.561 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-7e429db3-d8ae-4504-93c9-b3bbe8cdc007', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e429db3-d8ae-4504-93c9-b3bbe8cdc007', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bb651850cad14d76bf9ffb2d11fd8747', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9249ec89-4893-4da0-9067-aa2693e86932, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=579a4ffd-e114-452c-9e7e-536d8c8914c1) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:04:04 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:04.563 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 579a4ffd-e114-452c-9e7e-536d8c8914c1 in datapath 7e429db3-d8ae-4504-93c9-b3bbe8cdc007 unbound from our chassis#033[00m Feb 20 05:04:04 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:04.566 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e429db3-d8ae-4504-93c9-b3bbe8cdc007, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 05:04:04 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:04.567 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[45b8b7a7-3151-47f3-9df4-a4d89b144dbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 05:04:04 localhost nova_compute[281288]: 2026-02-20 10:04:04.580 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:05 localhost nova_compute[281288]: 2026-02-20 10:04:05.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:06.027 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:04:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:06.027 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:04:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:06.029 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:04:06 localhost ovn_controller[156798]: 2026-02-20T10:04:06Z|00430|binding|INFO|Releasing lport 
3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:04:06 localhost nova_compute[281288]: 2026-02-20 10:04:06.250 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:06 localhost nova_compute[281288]: 2026-02-20 10:04:06.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:07 localhost dnsmasq[323894]: exiting on receipt of SIGTERM Feb 20 05:04:07 localhost systemd[1]: libpod-c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023.scope: Deactivated successfully. 
Feb 20 05:04:07 localhost podman[324153]: 2026-02-20 10:04:07.487382192 +0000 UTC m=+0.067869332 container kill c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 20 05:04:07 localhost podman[324167]: 2026-02-20 10:04:07.564907918 +0000 UTC m=+0.067735099 container died c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 20 05:04:07 localhost podman[324167]: 2026-02-20 10:04:07.609018113 +0000 UTC m=+0.111845254 container cleanup c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 
05:04:07 localhost systemd[1]: libpod-conmon-c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023.scope: Deactivated successfully. Feb 20 05:04:07 localhost podman[324174]: 2026-02-20 10:04:07.646220649 +0000 UTC m=+0.135217118 container remove c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 20 05:04:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:07.677 264355 INFO neutron.agent.dhcp.agent [None req-4bbc60d1-999e-4405-8002-325e8e505b6c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:04:07 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:07.678 264355 INFO neutron.agent.dhcp.agent [None req-4bbc60d1-999e-4405-8002-325e8e505b6c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:04:07 localhost nova_compute[281288]: 2026-02-20 10:04:07.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:07 localhost nova_compute[281288]: 2026-02-20 10:04:07.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:07 localhost nova_compute[281288]: 
2026-02-20 10:04:07.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:07 localhost nova_compute[281288]: 2026-02-20 10:04:07.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:04:07 localhost nova_compute[281288]: 2026-02-20 10:04:07.744 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:04:07 localhost nova_compute[281288]: 2026-02-20 10:04:07.744 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:04:07 localhost nova_compute[281288]: 2026-02-20 10:04:07.745 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 05:04:07 localhost nova_compute[281288]: 2026-02-20 10:04:07.745 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df 
--format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:04:07 localhost sshd[324198]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.177 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:08 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:04:08 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3556237270' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.199 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.272 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.273 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:04:08 localhost systemd[1]: tmp-crun.j9DUt1.mount: Deactivated successfully. 
Feb 20 05:04:08 localhost systemd[1]: var-lib-containers-storage-overlay-77fa0a05beec35518c693185a2da6b468c236c000516b316ebb93a07ea0c1c2b-merged.mount: Deactivated successfully. Feb 20 05:04:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023-userdata-shm.mount: Deactivated successfully. Feb 20 05:04:08 localhost systemd[1]: run-netns-qdhcp\x2d7e429db3\x2dd8ae\x2d4504\x2d93c9\x2db3bbe8cdc007.mount: Deactivated successfully. Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.508 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.511 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11204MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.511 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.512 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.588 281292 DEBUG 
nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.589 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.589 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 05:04:08 localhost nova_compute[281288]: 2026-02-20 10:04:08.640 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:04:09 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:04:09 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3887188390' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:04:09 localhost nova_compute[281288]: 2026-02-20 10:04:09.067 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:04:09 localhost nova_compute[281288]: 2026-02-20 10:04:09.074 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 05:04:09 localhost nova_compute[281288]: 2026-02-20 10:04:09.095 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 05:04:09 localhost nova_compute[281288]: 2026-02-20 10:04:09.097 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 05:04:09 localhost nova_compute[281288]: 2026-02-20 10:04:09.098 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:04:09 localhost sshd[324243]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:04:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:04:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. Feb 20 05:04:11 localhost systemd[1]: tmp-crun.YIv1uB.mount: Deactivated successfully. Feb 20 05:04:11 localhost podman[324245]: 2026-02-20 10:04:11.165913568 +0000 UTC m=+0.103325104 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, version=9.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=) Feb 20 05:04:11 localhost ovn_controller[156798]: 2026-02-20T10:04:11Z|00431|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:04:11 localhost podman[324246]: 2026-02-20 10:04:11.208905159 +0000 UTC m=+0.142694905 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 20 05:04:11 localhost podman[324246]: 2026-02-20 10:04:11.225782934 
+0000 UTC m=+0.159572630 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 05:04:11 localhost podman[324245]: 2026-02-20 10:04:11.238045099 +0000 UTC m=+0.175456605 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, 
release=1770267347, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 20 05:04:11 localhost nova_compute[281288]: 2026-02-20 10:04:11.239 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:11 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. Feb 20 05:04:11 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 05:04:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.094 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.095 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.095 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.095 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.185 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.186 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.186 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.187 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.796 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": 
"e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.815 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.816 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.817 281292 DEBUG oslo_service.periodic_task [None 
req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:12 localhost nova_compute[281288]: 2026-02-20 10:04:12.818 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 05:04:13 localhost nova_compute[281288]: 2026-02-20 10:04:13.181 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:14 localhost nova_compute[281288]: 2026-02-20 10:04:14.440 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:04:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:17 localhost podman[241968]: time="2026-02-20T10:04:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:04:17 localhost podman[241968]: @ - - [20/Feb/2026:10:04:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:04:17 localhost podman[241968]: @ - - [20/Feb/2026:10:04:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18356 "" "Go-http-client/1.1" Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.188 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.191 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.191 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.192 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.214 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.214 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:04:18 localhost sshd[324287]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:04:18 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:18.539 264355 INFO neutron.agent.linux.ip_lib [None req-a31bd127-1bf3-4a93-afef-73b69b90945b - - - - - -] Device tapd32edf6e-af cannot be used as it has no MAC address#033[00m Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.566 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:18 localhost kernel: device tapd32edf6e-af entered promiscuous mode Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:18 localhost ovn_controller[156798]: 2026-02-20T10:04:18Z|00432|binding|INFO|Claiming lport d32edf6e-af3c-4889-8b25-6677d97e68ca for this chassis. Feb 20 05:04:18 localhost ovn_controller[156798]: 2026-02-20T10:04:18Z|00433|binding|INFO|d32edf6e-af3c-4889-8b25-6677d97e68ca: Claiming unknown Feb 20 05:04:18 localhost NetworkManager[5988]: [1771581858.5783] manager: (tapd32edf6e-af): new Generic device (/org/freedesktop/NetworkManager/Devices/66) Feb 20 05:04:18 localhost systemd-udevd[324299]: Network interface NamePolicy= disabled on kernel command line. Feb 20 05:04:18 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:18.585 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf2a5acf56b14171a5a2864e56a6776f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bfdb76f-0f89-4a13-995e-55a12ac2e6c3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d32edf6e-af3c-4889-8b25-6677d97e68ca) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:04:18 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:18.586 162652 INFO neutron.agent.ovn.metadata.agent [-] Port d32edf6e-af3c-4889-8b25-6677d97e68ca in datapath 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa bound to our chassis#033[00m Feb 20 05:04:18 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:18.589 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5aa4082c-2d79-46f5-890f-3fe4407bf387 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 20 05:04:18 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:18.589 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 05:04:18 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:18.591 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7b41d26b-adda-46c5-af12-b179a9c6550b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 05:04:18 localhost ovn_controller[156798]: 2026-02-20T10:04:18Z|00434|binding|INFO|Setting lport d32edf6e-af3c-4889-8b25-6677d97e68ca ovn-installed in OVS Feb 20 05:04:18 localhost ovn_controller[156798]: 2026-02-20T10:04:18Z|00435|binding|INFO|Setting lport d32edf6e-af3c-4889-8b25-6677d97e68ca up in Southbound Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.644 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:18 localhost sshd[324307]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:04:18 localhost nova_compute[281288]: 2026-02-20 10:04:18.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:19 localhost podman[324354]: Feb 20 05:04:19 localhost podman[324354]: 2026-02-20 10:04:19.633893326 +0000 UTC m=+0.099550579 container create d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 20 05:04:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:04:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. Feb 20 05:04:19 localhost podman[324354]: 2026-02-20 10:04:19.584304652 +0000 UTC m=+0.049961946 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 20 05:04:19 localhost systemd[1]: Started libpod-conmon-d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e.scope. Feb 20 05:04:19 localhost systemd[1]: Started libcrun container. 
Feb 20 05:04:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49096323e1fb7225bf70181bb6d0806fa8b81d0cddf9541ad891a72b23a8852e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 20 05:04:19 localhost podman[324354]: 2026-02-20 10:04:19.729844983 +0000 UTC m=+0.195502226 container init d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:04:19 localhost podman[324354]: 2026-02-20 10:04:19.738717464 +0000 UTC m=+0.204374707 container start d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:04:19 localhost dnsmasq[324392]: started, version 2.85 cachesize 150 Feb 20 05:04:19 localhost dnsmasq[324392]: DNS service limited to local subnets Feb 20 05:04:19 localhost dnsmasq[324392]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 20 05:04:19 localhost dnsmasq[324392]: warning: no upstream servers 
configured Feb 20 05:04:19 localhost dnsmasq-dhcp[324392]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 20 05:04:19 localhost dnsmasq[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/addn_hosts - 0 addresses Feb 20 05:04:19 localhost dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/host Feb 20 05:04:19 localhost dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/opts Feb 20 05:04:19 localhost podman[324369]: 2026-02-20 10:04:19.801119237 +0000 UTC m=+0.110179782 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 05:04:19 localhost podman[324369]: 2026-02-20 10:04:19.836113816 +0000 UTC m=+0.145174411 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 20 05:04:19 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 05:04:19 localhost podman[324368]: 2026-02-20 10:04:19.851958649 +0000 UTC m=+0.167200892 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 20 05:04:19 localhost podman[324368]: 2026-02-20 10:04:19.916096746 +0000 UTC m=+0.231338979 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127) Feb 20 05:04:19 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 05:04:19 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:19.929 264355 INFO neutron.agent.dhcp.agent [None req-080ebf1a-31de-4eee-8d19-78a05a4476e4 - - - - - -] DHCP configuration for ports {'2c135f3f-0ca4-4c6b-9c27-1389cfad3242'} is completed#033[00m Feb 20 05:04:20 localhost nova_compute[281288]: 2026-02-20 10:04:20.135 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:20 localhost systemd[1]: tmp-crun.Mgdatz.mount: Deactivated successfully. Feb 20 05:04:20 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:20.645 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:04:20Z, description=, device_id=e57870c6-94a1-474d-9937-954c4e871cf2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f7a85e65-2443-4667-95de-2ffc5f519eac, ip_allocation=immediate, mac_address=fa:16:3e:d6:73:42, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:04:16Z, description=, dns_domain=, id=2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1219578190-network, port_security_enabled=True, project_id=cf2a5acf56b14171a5a2864e56a6776f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28209, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3934, status=ACTIVE, subnets=['257e5086-6db1-4f5c-9f51-784fc979c3a4'], tags=[], tenant_id=cf2a5acf56b14171a5a2864e56a6776f, updated_at=2026-02-20T10:04:17Z, vlan_transparent=None, network_id=2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, 
port_security_enabled=False, project_id=cf2a5acf56b14171a5a2864e56a6776f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3942, status=DOWN, tags=[], tenant_id=cf2a5acf56b14171a5a2864e56a6776f, updated_at=2026-02-20T10:04:20Z on network 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa#033[00m Feb 20 05:04:20 localhost podman[324433]: 2026-02-20 10:04:20.901215923 +0000 UTC m=+0.093617877 container kill d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 20 05:04:20 localhost dnsmasq[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/addn_hosts - 1 addresses Feb 20 05:04:20 localhost dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/host Feb 20 05:04:20 localhost dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/opts Feb 20 05:04:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:21.181 264355 INFO neutron.agent.dhcp.agent [None req-1ea73423-3f5c-426f-8d76-7b1747981187 - - - - - -] DHCP configuration for ports {'f7a85e65-2443-4667-95de-2ffc5f519eac'} is completed#033[00m Feb 20 05:04:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:21.360 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:04:20Z, 
description=, device_id=e57870c6-94a1-474d-9937-954c4e871cf2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f7a85e65-2443-4667-95de-2ffc5f519eac, ip_allocation=immediate, mac_address=fa:16:3e:d6:73:42, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:04:16Z, description=, dns_domain=, id=2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1219578190-network, port_security_enabled=True, project_id=cf2a5acf56b14171a5a2864e56a6776f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28209, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3934, status=ACTIVE, subnets=['257e5086-6db1-4f5c-9f51-784fc979c3a4'], tags=[], tenant_id=cf2a5acf56b14171a5a2864e56a6776f, updated_at=2026-02-20T10:04:17Z, vlan_transparent=None, network_id=2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, port_security_enabled=False, project_id=cf2a5acf56b14171a5a2864e56a6776f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3942, status=DOWN, tags=[], tenant_id=cf2a5acf56b14171a5a2864e56a6776f, updated_at=2026-02-20T10:04:20Z on network 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa#033[00m Feb 20 05:04:21 localhost podman[324473]: 2026-02-20 10:04:21.564579053 +0000 UTC m=+0.060724574 container kill d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 20 05:04:21 localhost dnsmasq[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/addn_hosts - 1 addresses Feb 20 05:04:21 localhost dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/host Feb 20 05:04:21 localhost dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/opts Feb 20 05:04:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:21 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:21.936 264355 INFO neutron.agent.dhcp.agent [None req-8fc83402-ddf5-466b-a0e5-3d5fe3ddf4b9 - - - - - -] DHCP configuration for ports {'f7a85e65-2443-4667-95de-2ffc5f519eac'} is completed#033[00m Feb 20 05:04:22 localhost sshd[324495]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:04:23 localhost nova_compute[281288]: 2026-02-20 10:04:23.249 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:26 localhost openstack_network_exporter[244414]: ERROR 10:04:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:04:26 localhost openstack_network_exporter[244414]: Feb 20 05:04:26 localhost openstack_network_exporter[244414]: ERROR 10:04:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:04:26 localhost openstack_network_exporter[244414]: Feb 20 05:04:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:27 localhost nova_compute[281288]: 2026-02-20 10:04:27.173 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. Feb 20 05:04:28 localhost systemd[1]: tmp-crun.m03X9m.mount: Deactivated successfully. Feb 20 05:04:28 localhost podman[324497]: 2026-02-20 10:04:28.169938261 +0000 UTC m=+0.110728350 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:04:28 localhost podman[324497]: 2026-02-20 10:04:28.180458292 +0000 UTC m=+0.121248391 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Feb 20 05:04:28 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:04:28 localhost nova_compute[281288]: 2026-02-20 10:04:28.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:32 localhost nova_compute[281288]: 2026-02-20 10:04:32.037 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:33 localhost nova_compute[281288]: 2026-02-20 10:04:33.286 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:34 localhost sshd[324517]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:04:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 05:04:35 localhost podman[324519]: 2026-02-20 10:04:35.16118649 +0000 UTC m=+0.086470878 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 05:04:35 localhost podman[324519]: 2026-02-20 10:04:35.198962623 +0000 UTC m=+0.124247001 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 05:04:35 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 05:04:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e276 e276: 6 total, 6 up, 6 in Feb 20 05:04:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:37 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e277 e277: 6 total, 6 up, 6 in Feb 20 05:04:37 localhost sshd[324541]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:04:38 localhost nova_compute[281288]: 2026-02-20 10:04:38.288 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:40 localhost dnsmasq[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/addn_hosts - 0 addresses Feb 20 05:04:40 localhost dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/host Feb 20 05:04:40 localhost dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/opts Feb 20 05:04:40 localhost podman[324560]: 2026-02-20 10:04:40.460422796 +0000 UTC m=+0.057880947 container kill d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:04:40 localhost ovn_controller[156798]: 2026-02-20T10:04:40Z|00436|binding|INFO|Releasing lport d32edf6e-af3c-4889-8b25-6677d97e68ca from this chassis (sb_readonly=0) Feb 20 05:04:40 localhost kernel: device tapd32edf6e-af left promiscuous mode Feb 20 05:04:40 localhost ovn_controller[156798]: 2026-02-20T10:04:40Z|00437|binding|INFO|Setting lport d32edf6e-af3c-4889-8b25-6677d97e68ca down in Southbound Feb 20 05:04:40 localhost nova_compute[281288]: 2026-02-20 10:04:40.689 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:40 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:40.698 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf2a5acf56b14171a5a2864e56a6776f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bfdb76f-0f89-4a13-995e-55a12ac2e6c3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=d32edf6e-af3c-4889-8b25-6677d97e68ca) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 20 05:04:40 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:40.702 162652 INFO neutron.agent.ovn.metadata.agent [-] Port d32edf6e-af3c-4889-8b25-6677d97e68ca in datapath 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa unbound from our chassis#033[00m Feb 20 05:04:40 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:40.704 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 20 05:04:40 localhost ovn_metadata_agent[162647]: 2026-02-20 10:04:40.709 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[32d4053b-7f00-41a3-9e5b-13d54db54837]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 20 05:04:40 localhost nova_compute[281288]: 2026-02-20 10:04:40.713 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:41 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:04:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 05:04:42 localhost podman[324583]: 2026-02-20 10:04:42.145063736 +0000 UTC m=+0.079567038 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9/ubi-minimal) Feb 20 05:04:42 localhost podman[324583]: 2026-02-20 10:04:42.161063614 +0000 UTC m=+0.095566846 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc.) Feb 20 05:04:42 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. Feb 20 05:04:42 localhost podman[324584]: 2026-02-20 10:04:42.254544227 +0000 UTC m=+0.186947655 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, 
container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 05:04:42 localhost podman[324584]: 2026-02-20 10:04:42.262441737 +0000 UTC m=+0.194845165 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 20 05:04:42 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 05:04:42 localhost ovn_controller[156798]: 2026-02-20T10:04:42Z|00438|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:04:42 localhost nova_compute[281288]: 2026-02-20 10:04:42.811 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:43 localhost dnsmasq[324392]: exiting on receipt of SIGTERM Feb 20 05:04:43 localhost podman[324642]: 2026-02-20 10:04:43.257488948 +0000 UTC m=+0.060649462 container kill d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 20 05:04:43 localhost systemd[1]: libpod-d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e.scope: Deactivated successfully. 
Feb 20 05:04:43 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 e278: 6 total, 6 up, 6 in Feb 20 05:04:43 localhost nova_compute[281288]: 2026-02-20 10:04:43.291 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:43 localhost nova_compute[281288]: 2026-02-20 10:04:43.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:43 localhost podman[324658]: 2026-02-20 10:04:43.337680394 +0000 UTC m=+0.058704802 container died d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 20 05:04:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e-userdata-shm.mount: Deactivated successfully. Feb 20 05:04:43 localhost systemd[1]: var-lib-containers-storage-overlay-49096323e1fb7225bf70181bb6d0806fa8b81d0cddf9541ad891a72b23a8852e-merged.mount: Deactivated successfully. 
Feb 20 05:04:43 localhost podman[324658]: 2026-02-20 10:04:43.38373387 +0000 UTC m=+0.104758208 container remove d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 20 05:04:43 localhost systemd[1]: libpod-conmon-d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e.scope: Deactivated successfully. Feb 20 05:04:43 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:43.407 264355 INFO neutron.agent.dhcp.agent [None req-bfcfd522-9ba2-4fa8-a8bc-49ae408e8ef3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:04:43 localhost neutron_dhcp_agent[264351]: 2026-02-20 10:04:43.408 264355 INFO neutron.agent.dhcp.agent [None req-bfcfd522-9ba2-4fa8-a8bc-49ae408e8ef3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 20 05:04:43 localhost systemd[1]: run-netns-qdhcp\x2d2e2eb23b\x2df1a5\x2d4d6e\x2d92da\x2d7b51adfa2daa.mount: Deactivated successfully. 
Feb 20 05:04:46 localhost ovn_controller[156798]: 2026-02-20T10:04:46Z|00439|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0) Feb 20 05:04:46 localhost nova_compute[281288]: 2026-02-20 10:04:46.389 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:46 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:47 localhost podman[241968]: time="2026-02-20T10:04:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:04:47 localhost podman[241968]: @ - - [20/Feb/2026:10:04:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:04:47 localhost podman[241968]: @ - - [20/Feb/2026:10:04:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18356 "" "Go-http-client/1.1" Feb 20 05:04:48 localhost nova_compute[281288]: 2026-02-20 10:04:48.343 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:49 localhost sshd[324685]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:04:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f. Feb 20 05:04:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916. 
Feb 20 05:04:50 localhost podman[324688]: 2026-02-20 10:04:50.156657398 +0000 UTC m=+0.088437659 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 20 05:04:50 localhost 
podman[324688]: 2026-02-20 10:04:50.190953465 +0000 UTC m=+0.122733696 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true) Feb 20 05:04:50 localhost systemd[1]: 
ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully. Feb 20 05:04:50 localhost systemd[1]: tmp-crun.ozpwH3.mount: Deactivated successfully. Feb 20 05:04:50 localhost podman[324687]: 2026-02-20 10:04:50.267361656 +0000 UTC m=+0.201123947 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 20 05:04:50 localhost podman[324687]: 2026-02-20 10:04:50.334170305 +0000 UTC m=+0.267932606 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller) Feb 20 05:04:50 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully. 
Feb 20 05:04:51 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:53 localhost nova_compute[281288]: 2026-02-20 10:04:53.348 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:04:53 localhost nova_compute[281288]: 2026-02-20 10:04:53.351 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:04:53 localhost nova_compute[281288]: 2026-02-20 10:04:53.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:04:53 localhost nova_compute[281288]: 2026-02-20 10:04:53.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:04:53 localhost nova_compute[281288]: 2026-02-20 10:04:53.381 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:53 localhost nova_compute[281288]: 2026-02-20 10:04:53.382 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:04:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:04:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:04:53 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:04:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 
20 05:04:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:04:54 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:04:55 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 20 05:04:55 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:04:56 localhost openstack_network_exporter[244414]: ERROR 10:04:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 20 05:04:56 localhost openstack_network_exporter[244414]: Feb 20 05:04:56 localhost openstack_network_exporter[244414]: ERROR 10:04:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 20 05:04:56 localhost openstack_network_exporter[244414]: Feb 20 05:04:56 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0. 
Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.305722) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52 Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898305843, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 1036, "num_deletes": 251, "total_data_size": 1541594, "memory_usage": 1563584, "flush_reason": "Manual Compaction"} Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898315977, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 1012543, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33039, "largest_seqno": 34070, "table_properties": {"data_size": 1008141, "index_size": 2065, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9542, "raw_average_key_size": 18, "raw_value_size": 999022, "raw_average_value_size": 1982, "num_data_blocks": 86, "num_entries": 504, "num_filter_entries": 504, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581838, "oldest_key_time": 1771581838, "file_creation_time": 1771581898, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}} Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 10325 microseconds, and 5245 cpu microseconds. Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.316061) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 1012543 bytes OK Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.316092) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.318333) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.318396) EVENT_LOG_v1 {"time_micros": 1771581898318384, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.318437) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 1536419, prev total WAL file 
size 1536419, number of live WAL files 2. Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.319519) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353238' seq:72057594037927935, type:22 .. '6B760031373739' seq:0, type:0; will stop at (end) Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(988KB)], [51(19MB)] Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898319605, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 21455372, "oldest_snapshot_seqno": -1} Feb 20 05:04:58 localhost nova_compute[281288]: 2026-02-20 10:04:58.383 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:04:58 localhost nova_compute[281288]: 2026-02-20 10:04:58.417 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:04:58 localhost nova_compute[281288]: 2026-02-20 10:04:58.417 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:04:58 localhost nova_compute[281288]: 2026-02-20 10:04:58.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 14329 keys, 20410290 bytes, temperature: kUnknown Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898419759, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 20410290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20325987, "index_size": 47402, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35845, "raw_key_size": 384697, "raw_average_key_size": 26, "raw_value_size": 20080133, "raw_average_value_size": 1401, "num_data_blocks": 1768, "num_entries": 14329, "num_filter_entries": 14329, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581898, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}} Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. 
max_bytes_for_level_multiplier may not be guaranteed. Feb 20 05:04:58 localhost nova_compute[281288]: 2026-02-20 10:04:58.419 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:04:58 localhost nova_compute[281288]: 2026-02-20 10:04:58.420 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.420095) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 20410290 bytes Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.422101) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.0 rd, 203.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 19.5 +0.0 blob) out(19.5 +0.0 blob), read-write-amplify(41.3) write-amplify(20.2) OK, records in: 14860, records dropped: 531 output_compression: NoCompression Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.422132) EVENT_LOG_v1 {"time_micros": 1771581898422117, "job": 30, "event": "compaction_finished", "compaction_time_micros": 100239, "compaction_time_cpu_micros": 54165, "output_level": 6, "num_output_files": 1, "total_output_size": 20410290, "num_input_records": 14860, "num_output_records": 14329, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 
Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898422397, "job": 30, "event": "table_file_deletion", "file_number": 53} Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898425187, "job": 30, "event": "table_file_deletion", "file_number": 51} Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.319364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425277) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:04:58 localhost ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425286) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 20 05:04:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3. 
Feb 20 05:04:59 localhost podman[324868]: 2026-02-20 10:04:59.143627492 +0000 UTC m=+0.081334183 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:04:59 localhost podman[324868]: 2026-02-20 10:04:59.157071432 +0000 UTC m=+0.094778103 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 20 05:04:59 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully. Feb 20 05:04:59 localhost ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' Feb 20 05:05:01 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:05:01 localhost nova_compute[281288]: 2026-02-20 10:05:01.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 20 05:05:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3731268602' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 20 05:05:02 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 20 05:05:02 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3731268602' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 20 05:05:03 localhost sshd[324888]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:05:03 localhost nova_compute[281288]: 2026-02-20 10:05:03.422 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:05:03 localhost nova_compute[281288]: 2026-02-20 10:05:03.424 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:05:03 localhost nova_compute[281288]: 2026-02-20 10:05:03.424 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:05:03 localhost nova_compute[281288]: 2026-02-20 10:05:03.425 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:05:03 localhost nova_compute[281288]: 2026-02-20 10:05:03.454 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:05:03 localhost nova_compute[281288]: 2026-02-20 10:05:03.455 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:05:04 localhost sshd[324890]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:05:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb. 
Feb 20 05:05:05 localhost podman[324892]: 2026-02-20 10:05:05.366139398 +0000 UTC m=+0.082119007 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 20 05:05:05 localhost podman[324892]: 2026-02-20 10:05:05.404108146 +0000 UTC m=+0.120087715 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 20 05:05:05 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully. Feb 20 05:05:05 localhost nova_compute[281288]: 2026-02-20 10:05:05.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:05:06.027 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:05:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:05:06.028 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:05:06 localhost ovn_metadata_agent[162647]: 2026-02-20 10:05:06.029 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:05:06 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:05:06 localhost nova_compute[281288]: 2026-02-20 10:05:06.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic 
task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:07 localhost nova_compute[281288]: 2026-02-20 10:05:07.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.455 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.743 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 20 05:05:08 localhost nova_compute[281288]: 2026-02-20 10:05:08.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:05:10 localhost nova_compute[281288]: 2026-02-20 10:05:10.523 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:05:10 localhost nova_compute[281288]: 2026-02-20 10:05:10.525 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:05:10 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:05:10 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2880488553' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:05:10 localhost nova_compute[281288]: 2026-02-20 10:05:10.819 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.047 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.048 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.250 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.252 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11196MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.252 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.253 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.337 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.337 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.338 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.383 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 20 05:05:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:05:11 localhost ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 20 05:05:11 localhost ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1236691951' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.875 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.882 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.899 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.902 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 20 05:05:11 localhost nova_compute[281288]: 2026-02-20 10:05:11.902 281292 DEBUG 
oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 20 05:05:12 localhost nova_compute[281288]: 2026-02-20 10:05:12.903 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:12 localhost nova_compute[281288]: 2026-02-20 10:05:12.903 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:12 localhost nova_compute[281288]: 2026-02-20 10:05:12.904 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 20 05:05:12 localhost nova_compute[281288]: 2026-02-20 10:05:12.904 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 20 05:05:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da. Feb 20 05:05:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079. 
Feb 20 05:05:13 localhost podman[324960]: 2026-02-20 10:05:13.161903725 +0000 UTC m=+0.085010884 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7) Feb 20 05:05:13 localhost podman[324960]: 2026-02-20 10:05:13.203101693 +0000 UTC m=+0.126208802 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.213 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.213 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.214 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.214 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 20 05:05:13 localhost systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully. 
Feb 20 05:05:13 localhost podman[324961]: 2026-02-20 10:05:13.223325699 +0000 UTC m=+0.140447336 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 20 05:05:13 localhost podman[324961]: 2026-02-20 10:05:13.236085749 +0000 UTC m=+0.153207356 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 20 05:05:13 localhost systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully. 
Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.656 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.677 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.678 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.678 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 20 05:05:13 localhost nova_compute[281288]: 2026-02-20 10:05:13.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 20 05:05:13 localhost sshd[325005]: main: sshd: ssh-rsa algorithm is disabled Feb 20 05:05:13 localhost systemd-logind[759]: New session 74 of user zuul. Feb 20 05:05:13 localhost systemd[1]: Started Session 74 of User zuul. 
Feb 20 05:05:14 localhost python3[325027]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-9e25-3a25-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 20 05:05:15 localhost nova_compute[281288]: 2026-02-20 10:05:15.526 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:05:15 localhost nova_compute[281288]: 2026-02-20 10:05:15.529 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 20 05:05:15 localhost nova_compute[281288]: 2026-02-20 10:05:15.529 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 20 05:05:15 localhost nova_compute[281288]: 2026-02-20 10:05:15.530 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:05:15 localhost nova_compute[281288]: 2026-02-20 10:05:15.556 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 20 05:05:15 localhost nova_compute[281288]: 2026-02-20 10:05:15.556 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 20 05:05:16 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 20 05:05:17 localhost ovn_controller[156798]: 
2026-02-20T10:05:17Z|00440|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory Feb 20 05:05:17 localhost podman[241968]: time="2026-02-20T10:05:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 20 05:05:17 localhost podman[241968]: @ - - [20/Feb/2026:10:05:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 20 05:05:17 localhost podman[241968]: @ - - [20/Feb/2026:10:05:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18351 "" "Go-http-client/1.1" Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.321 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes 
in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.351 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.352 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc1a5acc-16a3-415a-bb22-47430dd09ed1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.323181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8d80af8-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '4bb8d7aa4b08c5ac2e1ac52748e6235eb9b6198176fc65ea951d9d2781487051'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.323181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8d826b4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '008853a379d2a3000938877f00bc3c42349e69496e8aefe858e487af276fac05'}]}, 'timestamp': '2026-02-20 10:05:18.353462', '_unique_id': 'c20d32d7b628424b9c14b61690a92125'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:05:18.355 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.356 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.356 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.357 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf5db138-1fa6-494c-bef4-aa57204ebc65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.356597', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': 
'43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8d8bc82-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '89dc85777c359a01bda9440cd6a8dec9f1106da60b21563fd962bca3b2cfb817'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.356597', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8d8d6e0-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'cb1c3b897595eca11b9fdce0c736509e744e2b0f43e44a6391e06a06d2cc7b4a'}]}, 'timestamp': '2026-02-20 10:05:18.358012', '_unique_id': '36b5a04ce672485baea05792b41e2470'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging Traceback (most 
recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging Feb 
20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging Feb 20 
05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.372 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.373 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebc6d929-3679-438c-a5d8-380ed70a9fe2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.361222', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8db2b7a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '91076383257015028fdd11cb1114204af5efdf2c5901fe0ef25186eec90d3ec6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.361222', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8db4d26-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': 'd0d9832ce268bb96148dcfdd8c351638a1088ad467b5a473fa9e7a53e3038842'}]}, 'timestamp': '2026-02-20 10:05:18.374177', '_unique_id': '7463d041ae7b469eac68a3bb7180433d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:05:18.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.376 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.376 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.377 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '969cdeb7-6c56-49c5-afd9-49f75a81aa38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.376702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8dbcc10-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'fa6a7d3e1b59b6ea93970ea6008a70e456ecd084834d7f821732b8819cf7f3cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.376702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8dbe66e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'ec71536623297488afb638bec8e59d01a21d6b6730e7bda2a38926d1e119ea55'}]}, 'timestamp': '2026-02-20 10:05:18.378069', '_unique_id': '2964b6d46b2e4221b614308b502fa7f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 
05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.381 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.398 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '90f79615-914a-4846-829f-2d6e93b22978', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:05:18.381593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a8df2720-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.637858066, 'message_signature': '7a338690fbb3b9e5764a1ac6a989c653db9d5f8068dbcc688eecf6904ecdabec'}]}, 'timestamp': '2026-02-20 10:05:18.399386', '_unique_id': 'aa4981ad14ce49cba38c82d43390a7ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.401 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 20 05:05:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.401 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.402 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bf7cd13-c7d0-48e0-8695-529c4ba142c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.401883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8dfa358-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '4e3b8ac6482649ac454a84bb69cfdbc37ea14c79483035ac779bd7b8da3158cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.401883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8dfbd34-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '31c1433eab048cc41d14c89737311e5b312907dcf86cf197a79ade01a3fe18cc'}]}, 'timestamp': '2026-02-20 10:05:18.403222', '_unique_id': '8f71d81c636248c28d85aabc917001e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:05:18.404 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.406 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.410 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80d0f8fe-e992-4c01-b94a-da3dfe2c8670', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.406863', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 
'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e0f3a2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '41565f429267b62a3b9031a01f269fef77fef1a41e595502a022c1ebfecfa019'}]}, 'timestamp': '2026-02-20 10:05:18.411213', '_unique_id': 'fa98f04052ef4f618f70a93712b69231'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.414 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.414 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '67b8143b-b1ad-453c-8f0e-e2da452dc3c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.414230', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e185b0-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '0a5d7cfeb21164a31c6af2f9f06ed4cad1fc2add8a04f8604548840b38fdbb18'}]}, 'timestamp': '2026-02-20 10:05:18.414972', '_unique_id': '1b33f1ef955744dcabc8a8d229cfccc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.418 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.418 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.419 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4708b033-a101-48dd-9713-676335401b1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.418332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8e226b4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '42a36472751d663c9daebf95878ff780d9efedd298658bfbb846e32ffcaca6bf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.418332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 
'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8e23f50-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '25791f15f5b445e4a42593240d50b39e7606e7dde9c20b6dd5c1ab7c6bccb63e'}]}, 'timestamp': '2026-02-20 10:05:18.419692', '_unique_id': 'f7dffba734f0400ca9caa0830841f7ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:05:18.421 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 
10:05:18.421 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.423 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.423 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5444f33d-003c-4c96-8c81-73b87fedff52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.423242', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e2e32e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': 'b4f5df9e9334420d8844b50feea9c872792b515980331d66ac1585a89615f8f1'}]}, 'timestamp': '2026-02-20 10:05:18.423860', '_unique_id': '58b800d1f8f545efabfd3b2f7e013438'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.426 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.426 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f7cecbe-c7a4-4def-977e-9a7988666f62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.426287', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e357e6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '0c0e76bc42114c348100d8bd6ae3183df509c25f04ead9df8429ea3f779ec9ac'}]}, 'timestamp': '2026-02-20 10:05:18.426789', '_unique_id': '5f512283d9a548b58886e12e7c7dfa29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.429 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76c71744-a304-466f-992b-d03a61cfaaa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.429101', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e3c5e6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '4f44b40701ebd6f011e899e2acca4065d746065e55d1414cf15a5cd4da050d10'}]}, 'timestamp': '2026-02-20 10:05:18.429577', '_unique_id': 'a111999803d34727890553e0ba73c296'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.431 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1979d68-ed82-4ff4-81b6-5fde694b0996', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.431930', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e4345e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': 'c914b34863b47bfe046b67097a8c5dbc4ba34c6a3f39cce42f6b80d36ef58b09'}]}, 'timestamp': '2026-02-20 10:05:18.432403', '_unique_id': '9ae79b11c49b423596ff87b89d5c84d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.434 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.434 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 20770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8092272-0ab0-4399-8549-a0a8665914c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20770000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:05:18.434566', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a8e49cbe-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.637858066, 'message_signature': '245df19d32f1d90a0f76872581db63aefa9a6078d2c001730266c4bf7b0bda63'}]}, 'timestamp': '2026-02-20 10:05:18.435060', '_unique_id': '2cd05016d8b44c728a43597837f73d6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 
05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.437 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle 
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.437 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98bc014b-6869-4ecb-8493-48bd4f0e41b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.437528', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e50d7a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '54bac9ee305e7a5c73748d6b3711b6f6e59c1271f3d0577b618236ed72cb7d70'}]}, 'timestamp': '2026-02-20 10:05:18.437872', '_unique_id': 'f710c58cb57e4ab78efed8a8230f3c49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 
localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.439 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.439 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ea0e7861-12e8-423f-9dc1-85fe3ae32332', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.439178', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e54c0e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '8dd6fd81ea4a1db988bbf52ca02ee3ae7f084540ba96abbde2c58b9d9ede4de5'}]}, 'timestamp': '2026-02-20 10:05:18.439480', '_unique_id': 'ae6f20c1022249bfb54fc9d95c5d28f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bcb4361-e7ad-42e5-973c-7a254ada144f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.440847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8e58d54-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '150e852923e574f8d606bad49d4661ea50a1f59f719e119accfba579b88f5ab1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.440847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8e597ae-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '19ab0dcfc09885260c4733d44641d80646b99bdac05e6629ff4f9f177ac4aaed'}]}, 'timestamp': '2026-02-20 10:05:18.441389', '_unique_id': '862297e7703f45a38db1d10f234677b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.442 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.442 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28700196-8abf-4e4d-a600-ba19af2b27cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.442750', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e5d796-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': 'fac41ed69908adcf957d16a9bbe832e6876ae56141295f01e49e5c3e45ead572'}]}, 'timestamp': '2026-02-20 10:05:18.443042', '_unique_id': '687f2940994841b58cdab035e9fd7eb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.444 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.444 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dceb67b0-4812-45c4-be82-5df840c94ff2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.444461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8e61ae4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '7d4b0d468b8192614bbd8249ffba97f02b9e59d214f8824357be9fbb68d28afa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.444461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8e62782-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'e4bf3769f8086bf3e3ed58068c7dd73fe719733fc55323f54f8f0ecacdf47a54'}]}, 'timestamp': '2026-02-20 10:05:18.445074', '_unique_id': '2ef5e6837613465eb397c6f6e6f026df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging yield
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:05:18 localhost
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.446 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.446 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.446 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37587125-7a23-466e-b97a-7b4140166098', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.446411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8e66684-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '4e4550ec6e7159d38cfdae9b4b106436d272c1ee6789c1b161554f3e9aa08523'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 
'timestamp': '2026-02-20T10:05:18.446411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8e671c4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'bd34cb9a3eff0ada9945b1a0c2587dfaa6a71794e0890fffe75c55eb3159926a'}]}, 'timestamp': '2026-02-20 10:05:18.446971', '_unique_id': '09c0c220b4254014814cefc79cae15c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.448 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.448 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c021b471-e23f-4af9-9607-945193da0be5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.448296', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e6b0e4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': 'e9c0259a5c8f41acc1ea78dbc7438efc460c8b047924f2f14bfc8b24d98b3398'}]}, 'timestamp': '2026-02-20 10:05:18.448606', '_unique_id': '92ed0a6cb54641e782de61f60e4c80f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 
2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging yield Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging Feb 20 05:05:18 localhost 
ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 05:05:18 localhost ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging
Feb 20 05:05:18 localhost systemd[1]: session-74.scope: Deactivated successfully.
Feb 20 05:05:18 localhost systemd-logind[759]: Session 74 logged out. Waiting for processes to exit.
Feb 20 05:05:18 localhost systemd-logind[759]: Removed session 74.
Feb 20 05:05:20 localhost nova_compute[281288]: 2026-02-20 10:05:20.557 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:05:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 05:05:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 05:05:21 localhost podman[325031]: 2026-02-20 10:05:21.153215058 +0000 UTC m=+0.084216640 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 05:05:21 localhost podman[325031]: 2026-02-20 10:05:21.187092292 +0000 UTC m=+0.118093894 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 05:05:21 localhost podman[325030]: 2026-02-20 10:05:21.199300274 +0000 UTC m=+0.134552866 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 05:05:21 localhost systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 05:05:21 localhost podman[325030]: 2026-02-20 10:05:21.267695801 +0000 UTC m=+0.202948413 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 05:05:21 localhost systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 05:05:21 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 05:05:25 localhost nova_compute[281288]: 2026-02-20 10:05:25.560 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:05:25 localhost nova_compute[281288]: 2026-02-20 10:05:25.562 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:05:25 localhost nova_compute[281288]: 2026-02-20 10:05:25.562 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 05:05:25 localhost nova_compute[281288]: 2026-02-20 10:05:25.563 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 05:05:25 localhost nova_compute[281288]: 2026-02-20 10:05:25.604 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:05:25 localhost nova_compute[281288]: 2026-02-20 10:05:25.605 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 05:05:26 localhost openstack_network_exporter[244414]: ERROR 10:05:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 05:05:26 localhost openstack_network_exporter[244414]:
Feb 20 05:05:26 localhost openstack_network_exporter[244414]: ERROR 10:05:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 05:05:26 localhost openstack_network_exporter[244414]:
Feb 20 05:05:26 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 05:05:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 05:05:30 localhost podman[325074]: 2026-02-20 10:05:30.150127304 +0000 UTC m=+0.087713928 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute)
Feb 20 05:05:30 localhost podman[325074]: 2026-02-20 10:05:30.163290465 +0000 UTC m=+0.100877079 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 05:05:30 localhost systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 05:05:30 localhost nova_compute[281288]: 2026-02-20 10:05:30.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:05:30 localhost nova_compute[281288]: 2026-02-20 10:05:30.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:05:30 localhost nova_compute[281288]: 2026-02-20 10:05:30.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 05:05:30 localhost nova_compute[281288]: 2026-02-20 10:05:30.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 05:05:30 localhost nova_compute[281288]: 2026-02-20 10:05:30.644 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:05:30 localhost nova_compute[281288]: 2026-02-20 10:05:30.645 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 05:05:31 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 05:05:35 localhost nova_compute[281288]: 2026-02-20 10:05:35.646 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:05:35 localhost nova_compute[281288]: 2026-02-20 10:05:35.647 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 20 05:05:35 localhost nova_compute[281288]: 2026-02-20 10:05:35.648 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 20 05:05:35 localhost nova_compute[281288]: 2026-02-20 10:05:35.648 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 05:05:35 localhost nova_compute[281288]: 2026-02-20 10:05:35.650 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 20 05:05:35 localhost nova_compute[281288]: 2026-02-20 10:05:35.650 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 20 05:05:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 05:05:36 localhost podman[325094]: 2026-02-20 10:05:36.156854606 +0000 UTC m=+0.093774063 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 05:05:36 localhost podman[325094]: 2026-02-20 10:05:36.16518323 +0000 UTC m=+0.102102707 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 20 05:05:36 localhost systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 05:05:36 localhost ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 05:05:37 localhost sshd[325117]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 05:05:37 localhost systemd-logind[759]: New session 75 of user zuul.
Feb 20 05:05:37 localhost systemd[1]: Started Session 75 of User zuul.